The display demonstrated just how deep the desire in Washington goes to change how social media companies operate — while also underlining the lack of consensus on how exactly to do that. Some lawmakers proposed new legislation, while others called for reforming Section 230 of the Communications Decency Act, a decades-old law that shields tech companies from lawsuits stemming from the content users post on their sites.
“The power of this technology is awesome and terrifying, and each of you has failed to protect your users and the world from the worst consequences of your creations,” said Rep. Mike Doyle (Pa.), the top Democrat on a House Energy and Commerce panel focused on technology.
The hearing was the first time Facebook’s Mark Zuckerberg, Twitter’s Jack Dorsey and Google’s Sundar Pichai appeared before Congress since the Jan. 6 attack on the Capitol, which exploded out of a vortex of false claims spread by lawmakers and right-wing media figures that the 2020 presidential election had been rigged against President Donald Trump.
Doyle asked all of the CEOs if their companies were partly responsible for the Capitol riot, pressing them to answer in a yes or no format. Only Dorsey said yes.
Still, the companies fielded only a handful of questions on the topic, and the companies' expulsion of Trump from their platforms in the aftermath of the attack barely came up.
The executives aren’t new to testifying before Congress. Last summer, the CEOs of Facebook, Google, Amazon and Apple were grilled over antitrust concerns. And late last year, Republicans called Zuckerberg, Dorsey and Pichai to testify specifically on Section 230.
But the rapid-fire question-and-answer format appeared at times to prompt the executives to stumble, and even occasionally drew some interesting answers. Pichai, for example, turned out to be the only one of the three who had so far been vaccinated and who had seen the film “The Social Dilemma,” a Netflix documentary that seeks to unveil the addicting and sometimes dangerous aspects of social media.
Republicans, as they have in previous hearings, accused the companies of censoring conservative voices. But they also demanded that the tech companies do more to protect children and teens from cyberbullying and social media addiction. Several Democrats picked up on the same thread, accusing the CEOs of Google and Facebook of making money by advertising to children who technically aren’t allowed on their platforms.
The current law, the Children’s Online Privacy Protection Act, prohibits companies from collecting the data of children under 13 in most circumstances or targeting them with personalized advertising.
“Of course, every parent knows that kids under the age of 13 are on Facebook and Instagram,” said Rep. Kathy Castor (D-Fla.). “The problem is that you know it. And you know that the brain and social development is still evolving at a young age. There are reasons in the law that we said that cutoff is at 13.”
Legislators also questioned why Facebook and Google have created platforms for children. BuzzFeed reported last week that Facebook is planning an Instagram for children. The two companies also have Facebook Messenger Kids and YouTube Kids, respectively.
Some critics have said that those types of underage-targeted services are aimed at getting children hooked on social media early.
Pichai and Zuckerberg said children under 13 aren’t allowed on their platforms, so the companies don’t make money off them.
The original hearing topic of misinformation still loomed large, and the politicians managed to delve into levels of detail that had been missing from previous hearings on the subject. Several asked specifically about covid misinformation spreading through Latino communities. Others cited recent data on a rise in anti-Asian online hate and asked why hashtags like “#chinesevirus” weren’t banned.
Both of those topics forced answers from the CEOs that were more direct than the usual “I’ll get back to you,” a cliche of these events. Zuckerberg defended Facebook’s record on Spanish-language misinformation and committed to making it a priority. Dorsey said Twitter didn’t block potentially racist hashtags because they could also be used by those who fight back against racists online.
Rep. A. Donald McEachin (D-Va.) asked why Facebook wasn’t applying the same rules regarding covid-19 misinformation to climate change misinformation. Zuckerberg’s answer: Covid lies have the potential to cause “imminent physical harm,” while climate change misinformation doesn’t.
Two lawmakers asked about Google’s recent efforts to reform online advertising — a topic that rarely gets any attention outside of wonky tech advertising circles. The company is making changes to its Chrome browser — the most popular way to access the Internet worldwide — that would make it harder for advertisers to track individuals. Privacy advocates support the move, but antitrust officials in the United States and United Kingdom have said it could quash competition.
Taken together, the long list of questions suggested that many of the committee members showed up prepared and knowing what they wanted to get out of the executives.
“The questioning from both sides shows that lawmakers are serious,” said Alexandra Givens, CEO of the Center for Democracy and Technology, a think tank that takes funding from foundations and companies, including Google and Facebook. “How they actually craft a path forward remains to be seen.”
Still, that doesn’t mean that a consensus has emerged on what comes next.
Lawmakers from both parties suggested the time had come to make changes to Section 230, for example. But the two parties want the law changed in opposing ways. Democrats want it to hold companies to a higher standard for the spread of racism and misinformation. Republicans want the companies to cut back on moderation, arguing that current practices threaten free speech.
Politicians have introduced a flurry of bills that would significantly change Section 230, but they have yet to coalesce around a single proposal. Sens. Brian Schatz (D-Hawaii) and John Thune (R-S.D.) have unveiled the PACT Act, which would force companies to be more transparent about content moderation. A group of Democrats in the House and Senate introduced the Safe Tech Act, which aims to hold the tech companies more accountable when posts on their services result in real-world harms.
At Thursday’s hearing, Rep. Anna G. Eshoo (D-Calif.) floated her bill, the Protecting Americans from Dangerous Algorithms Act, which would amend Section 230 to remove tech companies’ protections from lawsuits when their algorithms amplify content that leads to offline violence.
And Rep. Yvette D. Clarke (D-N.Y.) announced plans to introduce a bill that would change Section 230 to prevent discrimination in online advertising.
Zuckerberg brought his own ideas for changing Section 230. The Facebook CEO said new legislation should require the biggest tech platforms to be more transparent about their rules for taking down content, and hold them liable when they fail to remove illegal activity after finding it on their platforms. Pichai and Dorsey said they generally agree with Zuckerberg’s proposals, though Dorsey said they would be difficult to enact.
“It’s going to be very hard to determine what’s a large platform and what’s a small platform,” Dorsey said.
Below are the updates from the House hearing.