Facebook chief executive Mark Zuckerberg contacted House Speaker Nancy Pelosi (D-Calif.) in recent weeks to discuss how his company handles viral misinformation, but she has not called him back or personally replied, according to two people familiar with the exchanges.
Pelosi’s decision not to engage with Zuckerberg, one of the most powerful technology executives in the world, reflects her frustration with how Facebook handled a manipulated video clip of remarks by the speaker, said the people, who spoke on the condition of anonymity because they were not authorized to speak publicly.
The video of Pelosi’s speech at a Center for American Progress event last month, in which she said President Trump’s refusal to cooperate with congressional investigations was tantamount to a “coverup,” was slowed and distorted to make her speech sound garbled and slurred. The video circulated on several social platforms, including Facebook, which refused to take it down.
According to the people familiar with the matter, Pelosi has not been eager to hear Zuckerberg’s explanation for the company’s actions.
The impasse between the nation’s most powerful Democratic lawmaker and the social media titan highlights broader tensions within the Democratic Party about Facebook and the company’s efforts to counter foreign interference in elections as well as the spread of falsehoods.
A spokesman for Pelosi declined to comment. Facebook declined to comment on Zuckerberg’s private communications.
A growing number of Democratic presidential candidates have been calling for increased regulation of Facebook and other technology companies, with some contenders seeking to break up the companies and regulate them like public utilities. Earlier this month, the House’s top antitrust subcommittee announced a sweeping investigation of whether powerful Silicon Valley firms are stifling competition and putting consumers at risk. The Justice Department and the Federal Trade Commission are also probing the companies.
Amid those investigations, Zuckerberg has called for clearer regulation of technology giants, saying the companies shouldn’t make certain decisions without a more “standardized approach.”
On Thursday, the House Intelligence Committee — which Pelosi has encouraged to investigate Russian interference in the 2016 presidential election and the use of social media by foreign agents and affiliated organizations — will hold a hearing on “manipulated media and ‘deepfake’ technology.”
Videos known as “deepfakes” are altered using artificial intelligence to create fictional or misleading content. Technology and political campaign experts say the rise of manipulated media is a significant risk in the 2020 elections, and major companies are at odds over how to handle it.
Most political campaigns depend on Facebook and Google to reach potential voters through targeted ads — and Silicon Valley is one of the Democratic Party’s most reliable sources of big donations.
But Pelosi, though friendly with many technology sector leaders, remains unhappy with Facebook’s decision in late May not to remove a video from its platform that was manipulated to make her seem drunk, according to the people familiar with the matter.
Although Pelosi hasn’t responded to Zuckerberg’s overtures, her staff and his staff have been in touch, the people said.
One version of the manipulated video, posted by the conservative Facebook page Politics WatchDog, was viewed millions of times. Rudolph W. Giuliani, Trump’s attorney, even tweeted a link to the altered video with the note: “What is wrong with Nancy Pelosi? Her speech pattern is bizarre.” The tweet has since been deleted.
The altered video’s dissemination highlights the way viral misinformation could shape public perceptions in the run-up to next year’s election. People who spread misinformation don’t need sophisticated technology for a clip to go viral, and crude manipulations can be used to undermine an opponent.
In declining to remove the video, Facebook cited a long-standing policy of not banning false information. While Lead Stories and PolitiFact, two independent fact-checking groups that have contracts with the social network, deemed the video “false,” Facebook said in a statement that it does not have “a policy that stipulates that the information you post on Facebook must be true.”
The company said it instead would “heavily reduce” the video’s appearances in people’s news feeds, append a small informational box alongside the video linking to the two fact-checking sites and open a pop-up box linking to “additional reporting” whenever someone clicked the clip.
Facebook executives recognize that the company’s approach of waiting for fact-checkers to verify information often leads to significant delays in getting accurate information to the public, and that a video can go viral well before a fact-checker reviews it, according to a person familiar with the company’s deliberations who was not authorized to speak publicly. They are also aware that even when it is reviewed by fact-checkers, the “additional reporting” notice provided to Facebook users may not go far enough in explaining the nature of the deception.
But they also think an outright ban on manipulated media would lead to more accusations of censorship and bias, as well as enforcement challenges, the person said.
While he declined to directly address the steps the company is considering, Facebook spokesman Andy Stone said it was exploring how to improve its approach.
“Leading up to 2020, we know that combating misinformation is one of the most important things we can do. We continue to look at how we can improve our approach and the systems we’ve built. Part of that includes getting outside feedback from academics, experts and policymakers,” Stone said, noting that the company is funding studies of manipulated media at several universities.
After the distorted Pelosi video surfaced on Facebook, executives forcefully defended the company’s response.
“Anybody who is seeing this video in their news feed, anybody who is going to share it to somebody else, anybody who has shared it in the past, they are being alerted that this video is false,” Monika Bickert, Facebook’s head of product policy and counterterrorism, told CNN’s Anderson Cooper in an interview last month.
When Cooper pressed her and said that Facebook is “making money by being in the news business” and suggested it had an obligation to remove the video, Bickert said, “We are not in the news business; we are in the social media business.”
That response didn’t satisfy lawmakers such as Rep. David N. Cicilline (D-R.I.), who leads the House Judiciary Committee’s antitrust subcommittee. He took to Twitter to demand that Facebook “fix this now!”
Pelosi has also rebuked Facebook in the wake of the episode and questioned its integrity and management.
“We have said all along, ‘Poor Facebook, they were unwittingly exploited by the Russians,’ ” Pelosi told KQED radio in San Francisco last month. “I think wittingly, because right now they are putting up something that they know is false.”
Since the controversy, other technology companies have taken steps to expand their moderation of content on their platforms.
YouTube, the Google-owned video site, said last week that it would remove false videos alleging that events like the Holocaust didn’t happen, as well as an array of content by white supremacists and others, in a move to more aggressively crack down on hate speech. YouTube also took down the Pelosi clip, citing a policy of banning content intended to deceive. Twitter left it up.
Meanwhile, Facebook has told the U.S. government that it is willing to submit to greater oversight of its data-collection practices — from the launching of new services to the decisions of its top executives — to end a federal probe into privacy abuses that came to light last year.
The proposed concessions are part of talks between the tech giant and the Federal Trade Commission, according to a person with knowledge of the matter who spoke on the condition of anonymity because the conversations are private. The changes would accompany a record-breaking, multibillion-dollar fine that the FTC has considered levying against Facebook.
Zuckerberg, who has long shied away from visits to Washington, has become more engaged there as Facebook has faced greater congressional and federal review. He testified before Congress last year and has made several personal calls to lawmakers over the past few years, an associate said.
For instance, Zuckerberg called Sen. Mark R. Warner (D-Va.) in the fall of 2017, a few weeks before closely watched hearings on Russian interference in the 2016 election, to tell him that the company would release copies of Russian ads to Congress, according to two people familiar with the conversation. Facebook had not shared the content of the ads at the time.
Democrats, however, continue to explore further regulation of technology giants, as Zuckerberg’s efforts to build personal relationships do little to stave off that attention.
Pelosi wrote on Twitter on June 3, days after the uproar over the video, that she supports more regulation of technology companies: “Unwarranted, concentrated economic power in the hands of a few is dangerous to democracy — especially when digital platforms control content. The era of self-regulation is over.”
Those comments echo what Pelosi said in an April interview with the Recode Decode podcast. “In the [United Kingdom], as you know, they’ve said the era of self-regulation of these companies is over,” she said then. “It probably should be” in the United States.
Pelosi added, “I think we have to subject it all to scrutiny and cost-benefits and all that, but I do think that it’s a new era.”
Drew Harwell and Tony Romm contributed to this report.