
Opinion | Who’s responsible when ChatGPT goes off the rails? Congress should say.

The logos of Microsoft and ChatGPT. (Lionel Bonaventure/AFP via Getty Images)

The early days of OpenAI’s ChatGPT were something of a replay of internet history, beginning with excitement about these inventions and ending in trepidation about the harm they could do. ChatGPT and other “large language models” (artificially intelligent systems trained on vast troves of text) can turn into liars or racists or terrorist accomplices that explain how to build dirty bombs. The question is: When that happens, who’s responsible?

Section 230 of the Communications Decency Act says that services — from Facebook and Google to movie-review aggregators and mommy blogs with comment sections — shouldn’t face liability for most material from third parties. It’s fairly easy in these cases to distinguish between the platform and the person who’s posting. Not so with chatbots and AI assistants. Few have grappled with whether Section 230 provides protections to them.


Consider ChatGPT. Type in a question, and it provides an answer. It doesn’t merely surface existing content, such as a tweet, video or website originally contributed by someone else, but rather writes a contribution of its own in real time. The law withholds its protection from any person or entity that “develops” content even “in part.” And doesn’t transforming, say, a list of search results into a summary qualify as development? What’s more, the contours of every AI contribution are informed substantially by the AI’s creators, who have set the rules for their systems and shaped their output by reinforcing behaviors they like and discouraging those they don’t.

At the same time, ChatGPT’s every answer is, as one analyst put it, a “remix” of third-party material. The tool generates its responses by predicting what word should come next in a sentence, based on what words come next in sentences across the web. And as much as the creators behind a machine inform its outputs, so too do the users posing queries or engaging in conversation. All this suggests that the degree of protection afforded to AI models may vary by how much a given product is regurgitating versus synthesizing, as well as by how deliberately a user has goaded a model into producing a given reply.
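As a rough illustration of the next-word prediction described above, here is a deliberately tiny sketch in Python. It bears no resemblance to the neural networks that actually power ChatGPT; the three-sentence corpus, the counting approach and names such as predict_next are all invented for this example. It simply shows how a system that keeps choosing the likeliest next word, drawn from other people’s sentences, produces output that is, in effect, stitched together from third-party material.

from collections import Counter, defaultdict

# A toy corpus standing in for "sentences across the web" (invented for illustration).
corpus = [
    "the cat sat on the mat",
    "the cat sat on the sofa",
    "the dog sat on the mat",
]

# Count which word tends to follow each word in the corpus.
next_word_counts = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for current, following in zip(words, words[1:]):
        next_word_counts[current][following] += 1

def predict_next(word):
    """Return the word most often seen after `word` in the corpus, or None if unseen."""
    counts = next_word_counts.get(word)
    return counts.most_common(1)[0][0] if counts else None

# Generate a short continuation by repeatedly choosing the likeliest next word.
word, output = "the", ["the"]
for _ in range(5):
    word = predict_next(word)
    if word is None:
        break
    output.append(word)

print(" ".join(output))  # e.g. "the cat sat on the cat": every word is borrowed from someone else's sentence

Real models operate over learned probabilities rather than raw counts, and a user’s prompt steers the result as much as the training text does, which is the point above: designers, third-party text and users all leave fingerprints on what comes out.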


So far there’s no legal clarity. During oral argument in a recent case involving Section 230, Supreme Court Justice Neil M. Gorsuch observed that AI “generates polemics today that would be content that goes beyond picking, choosing, analyzing or digesting content,” and hypothesized that such output “is not protected.” Last week, the provision’s authors agreed with his analysis. But the companies working on the next frontier deserve a firmer answer from legislators. And to figure out what that answer should be, it’s worth looking, again, at the history of the internet.

Scholars believe that Section 230 was responsible for the web’s mighty growth in its formative years. Otherwise, endless lawsuits would have prevented any fledgling service from turning into a network as indispensable as a Google or a Facebook. That’s why many call Section 230 the “26 words that created the internet.” The trouble is that many now think a lack of consequences encouraged the internet not only to grow but also to grow out of control. With AI, the country has a chance to act on the lesson it has learned.

That lesson shouldn’t be to preemptively strip Section 230 immunity from large language models. After all, it was good that the internet could grow, even if its maladies did, too. Just as websites couldn’t hope to expand without the protections of Section 230, these products can’t hope to offer a vast variety of answers on a vast variety of subjects, in a vast variety of applications (which is what we should want them to do) without legal protections. Yet the United States also can’t afford to repeat its greatest mistake in internet governance, which was not to govern much at all.

Lawmakers should extend the temporary haven of Section 230 to the new AI models while watching what happens as this industry begins to boom. They should sort through the conundrums these tools provoke, such as who’s liable, say, in a defamation case if the developer isn’t. They should study complaints, including lawsuits, and judge whether such complaints could be avoided by modifying the immunity regime. They should, in short, let the internet of the future grow just like the internet of the past. But this time, they should pay attention.

The Post’s View | About the Editorial Board

Editorials represent the views of The Post as an institution, as determined through debate among members of the Editorial Board, based in the Opinions section and separate from the newsroom.

