Mourners attend a vigil after the Pulse nightclub shooting in Orlando in June. (Melissa Lyttle for The Washington Post)

In June, a gunman killed 49 people and wounded 53 others in a horrific spree of violence at a gay nightclub in Orlando. Now, the families of some of the victims are suing Google, Twitter and Facebook, arguing that the tech companies had a role in the shooter's radicalization.

The families are accusing the companies of providing support to the Islamic State, the terrorist organization that appeared to inspire the attack. Although the gunman, Omar Mateen, did not appear to have official ties to the Islamic State, also referred to as ISIS, the victims' families say the group's indirect influence over him is at least partly attributable to its “unfettered” ability to recruit fighters on social media.

Through their data-driven business models, companies such as Google, Twitter and Facebook even “profit from ISIS postings through advertising revenue,” according to the lawsuit, which was filed in a Michigan federal court Monday and was first reported by Fox News. The families of Tevin Eugene Crosby, Juan Guerrero and Javier Jorge-Reyes are demanding a trial and unspecified monetary compensation.

“Without … Twitter, Facebook, and Google (YouTube), the explosive growth of ISIS over the last few years into the most feared terrorist group in the world would not have been possible,” the lawsuit reads.

The suit is not the first to target tech companies over ISIS's influence. In two separate court proceedings earlier this year, the families of people killed in the 2015 Paris attacks and in an attack in Jordan accused Twitter of providing material support to the terrorist group. In both cases, Twitter argued that it could not be sued because it was simply acting as a conduit for online speech, rather than a speaker for ISIS itself. The judge in the Jordan case sided with Twitter, dismissing the suit on the grounds that Twitter is protected by the Communications Decency Act of 1996. The other case, which also names Google and Facebook, is pending. (Both were filed in California.)

Section 230 of the act shields “online intermediaries that host or republish speech … against a range of laws that might otherwise be used to hold them legally responsible for what others say and do,” according to the Electronic Frontier Foundation, a digital rights organization. The result, according to EFF, has been the flourishing of an Internet economy that includes Yelp, YouTube and Amazon — where individual users are held responsible for their own conduct on the platforms.

A lawyer representing the Orlando shooting victims' families said that Monday's filing contains a new claim that no lawsuit has tried before. It alleges that tech companies create new content that does not enjoy the protections of Section 230 when they marry advertising with an ISIS supporter's online posts. “While they didn't create the ad, and they didn't create the posting, by putting those things together, they created specific unique content,” said Keith Altman, an attorney at Excolo Law.

It's unclear whether this argument will fare any better than the earlier ones. Courts have traditionally held that online platforms are not legally liable for the content they host, according to experts who study ISIS. The platforms are protected “in the same way that a telephone company is not responsible if you use your phone to hire a contract killer,” said J.M. Berger, a fellow with the International Center for Counter-Terrorism. “It's very unlikely that this kind of lawsuit will succeed under current laws.”

But, Berger added, Congress could seek to change those laws if it saw fit. “In our brave new world starting January, who knows?” he said.

Further lawsuits accusing social media platforms of promoting ISIS may naturally peter out if judges continue to invoke the Communications Decency Act, according to a spokesperson for one of the tech companies named in Monday's suit. The person spoke on the condition of anonymity to discuss ongoing legal proceedings.

“These types of cases are always going to get press coverage,” the person said. “But I think there will be certain diminishing returns this time, because this is at least the third case like this, and the first two aren't going anywhere.”

Facebook said in a statement that it is committed to providing a safe experience for its users. “Our Community Standards make clear that there is no place on Facebook for groups that engage in terrorist activity,” the company said, “or for content that expresses support for such activity, and we take swift action to remove this content when it’s reported to us. We sympathize with the victims and their families.”

The other companies did not immediately respond to requests for comment.