
Opinion | Amend Section 230 to increase social media’s liability for drug sales on its platforms

Jennifer Stout, vice president of global public policy at Snap Inc., and Michael Beckerman, vice president and head of public policy at TikTok, testify before a Senate subcommittee hearing on consumer protection on Oct. 27. (Samuel Corum/Getty Images)

Devin Norring’s younger brother discovered his body.

The 19-year-old from Hastings, Minn., was scheduled to get dental work done on some cracked teeth last April — perhaps the cause of his agonizing migraines — when the pandemic postponed the appointment. To ease the pain, he arranged to buy Percocet over Snapchat. Or so he thought. The pain pill turned out to be laced with fentanyl, a synthetic opioid that can be fatal even in tiny doses.

Snapchat is shielded from any potential liability for Norring’s death because of Section 230 of the Communications Decency Act, the 1996 statute that treats websites as platforms rather than publishers. There are complicated questions about the broader wisdom of Section 230, but there’s no reason to continue to offer platforms a haven for drug trafficking. Just as Congress made an exception to Section 230 for prostitution in 2018, aimed at shuttering sites such as Backpage, it’s time for lawmakers to craft another carveout for the illegal sale of narcotics.

Norring was among more than 93,000 Americans who died in 2020 of drug overdoses, a surge of nearly 30 percent from the year before. The Partnership for Safe Medicines links Snapchat to the sale of fentanyl-laced counterfeit pills that have caused the deaths of teens and young adults in at least 15 states. NBC identified cases in five additional states and profiled eight victims earlier this month who thought they were buying prescription drugs like Xanax or OxyContin but wound up with fentanyl.


All the major platforms already ban drug-related sales. The companies have hired more moderators, developed artificial-intelligence algorithms and limited searches for keywords related to drugs. But it remains alarmingly easy to procure narcotics online. The Organization for Social Media Safety says its researchers connected with drug dealers on multiple social media sites in under three minutes.

The Drug Enforcement Administration recently issued its first public safety alert since 2015 to warn about the scourge of fake pills online. “The drug dealer isn’t just standing on a street corner anymore,” said DEA Administrator Anne Milgram. “It’s sitting in a pocket on your phone.”

The question of social media platforms’ liability for illegal drug sales was among the topics when TikTok and Snapchat executives testified before Congress for the first time on Tuesday. Sen. Amy Klobuchar (D-Minn.) read from a letter signed by Devin’s mother, Bridgette Norring, and six other parents whose children died from counterfeit pills they got through Snapchat.

The executives insisted they were doing their best. Snap vice president for global public policy Jennifer Stout testified that blocking drug-related content has been a “top priority,” but sellers are “constantly evading our tactics, not just on Snapchat, but on every platform.” TikTok vice president Michael Beckerman said 97 percent of content that violates the app’s terms of service is removed before anyone complains. The latest transparency report from Facebook, which owns Instagram, says its artificial-intelligence systems proactively identified 95.4 percent of drug-related posts and removed 5.5 million posts about drugs during the first quarter of this year.

Klobuchar, a former prosecutor, says poking a hole in the companies’ liability shield will prod them to be as aggressive about drugs as they already are about stopping terrorism and prostitution. “Maybe that’ll make you work even faster,” she told the tech executives, “so we don’t lose another kid.” She told me on Wednesday that requiring greater transparency into algorithms and enabling more safety controls for parents would also help.


There is strong bipartisan appetite for prodding tech companies to intensify efforts to stop drug sales. Sen. John Thune (R-S.D.) says he was disturbed by a Wall Street Journal article last month in which reporters set up automated bots to understand what TikTok shows young users. The app served up at least 569 videos about drug use, references to cocaine and meth addiction, and promotional videos for online sales of drug products and paraphernalia — to an account registered to a 13-year-old.

It’s impossible to keep everything off the platforms. TikTok claims 1 billion monthly active users, and Snapchat surpasses half a billion, compared to 330 million for Twitter. Viewers watch something like 1 billion hours of video each day on YouTube. The almost unimaginable amount of content means that completely stopping all drug-related activity is like playing an unwinnable game of whack-a-mole.

But the tech giants have hidden behind this excuse for too long. When YouTube vice president Leslie Miller called Section 230 “the backbone of the Internet” during Tuesday’s hearing, Sen. Richard Blumenthal (D-Conn.) retorted that “it’s a backbone without any real spine” because it confers “virtually limitless immunity.”

After the hearing, Blumenthal told me that any changes to Section 230 immunity should require demonstrating that the companies “knowingly or recklessly” tolerate the sale of illegal drugs. As always, the devil is in the details. Blumenthal predicted tough negotiations ahead over the legal standards for what would constitute such behavior.

Congress should take care not to pass overly onerous regulations that will entrench established players. But the problem is too big, and the danger to young people too great, to delay taking action.
