The meeting at Facebook’s Menlo Park, Calif., headquarters represents a new overture by the technology industry to develop closer ties to law enforcement to prevent abuse of social platforms. The nation’s top intelligence chiefs declared in February that the Kremlin is continuing its effort to disrupt the U.S. political system and to target the midterm elections. Director of National Intelligence Daniel Coats said at the time that operatives plan to use propaganda, false personas, and bots to undermine the upcoming election.
Guy Rosen, a top Facebook security executive, recently told The Washington Post that the social network has not yet found evidence of interference by the Internet Research Agency, the St. Petersburg-based organization that employed dozens of online trolls to manipulate social media during the 2016 presidential campaign, or by other Russian operations such as GRU, Russia’s military spy agency. “We’re constantly looking for more activity,” he said. “We’re running down a lot of investigations.”
Facebook confirmed the meeting but declined to comment further. The other companies as well as the FBI and DHS declined to comment or didn’t respond to requests for comment.
An invitation for the “election-protection” meeting from Facebook Chief Security Officer Alex Stamos said it would focus on “practical ways” the companies could most effectively collaborate with law enforcement, including identifying appropriate points of contact and creating clear communication channels, according to a copy reviewed by The Post.
Tech companies say they need law enforcement help because the private sector is not always aware of threats picked up by intelligence agencies.
Though federal agencies and Silicon Valley firms have communicated for years about issues such as child pornography and terrorist content, discussions about Russian interference did not take place during the 2016 campaign. The extent of the Russian effort to sow discord on Facebook, Twitter, Google, Reddit, and other online services became apparent only months after the election.
Russian attempts to spread divisive messages on social media do not appear to have abated since then. Russian influence has appeared to crop up during polarizing public events, such as the February massacre at a high school in Parkland, Fla., and during a congressional fight the previous month. As a debate over gun control raged in the days following the school shooting, automated accounts, or bots, appeared to support pro-gun messages on Twitter, according to Hamilton 68, a website that tracks pro-Kremlin messages on Twitter and was created by the Alliance for Securing Democracy, a group affiliated with the German Marshall Fund.
But U.S. officials say the scope of meddling does not seem as broad as what took place two years ago.
“We haven’t seen any real activities along the lines of what Russia did in 2016, but I don’t need to see that to do something,” said Christopher C. Krebs, undersecretary of Homeland Security’s National Protection and Programs Directorate, who attended the meeting at Facebook. “We’re full speed ahead. And the good news is the state and local election officials take this very seriously. They’re very much engaged.”
The lack of cooperation between Facebook and law enforcement during the 2016 presidential election resulted in potential missed opportunities to counter manipulation of the social media platform. In June 2016, cybersecurity experts at the company were tracking a Russian hacker group known as APT28, or Fancy Bear, which U.S. intelligence officials considered an arm of GRU, according to people familiar with Facebook’s activities.
Facebook executives raised their suspicions about a Russian spying operation with the FBI on two occasions, people familiar with the matter said. But the company didn’t hear back from U.S. officials, the people said.
Facebook’s experts assumed the hackers were following their usual tactics of stealing military plans and data from political targets, not participating in a far-reaching disinformation campaign designed to shape the outcome of the U.S. presidential race.
But the FBI at the time was tracking Russian government trolls who assumed fake identities to pen articles aimed at stoking divisions in American society. The bureau did not notify the social media firms or the publications that printed the trolls’ work.
Facebook executives grew frustrated that intelligence officials didn’t help prepare them for the threat, according to people familiar with the matter. They have complained about what they say is a lack of contact from law enforcement, despite repeated requests for guidance in the months following the election.
Free speech concerns were one reason for the lack of outreach. “We can’t just go to a Twitter, Google or Facebook and say, ‘Please shut down the account because we don’t like the content … we find this information offensive … or we find it’s coming from a false persona,’” said one former senior law enforcement official explaining the constraints on the FBI. “We’re not the thought police.”
After Facebook disclosed in September 2017 that it had uncovered Russian ads on its platform, the company began conversations with lawmakers and provided data to special counsel Robert S. Mueller III, who indicted the Internet Research Agency earlier this year.
Silicon Valley firms have also started to talk more with one another and share data through Qintel, a Pittsburgh-based cybersecurity company that maintains a large database of website registrations, botnets and compromised credentials collected from bad actors, according to people familiar with the process.
At the meeting last month, FBI officials flew in from Washington and discussed high-level efforts by the bureau’s Foreign Influence Task Force, set up last year to counter attempts by adversaries such as Russia to interfere in U.S. elections and democratic processes.
Department of Homeland Security officials updated the companies on their efforts to help state and local officials secure election infrastructure.
Facebook shared information on its effort to combat election interference in the U.S. and around the world, which includes more human oversight, artificial intelligence tools and the use of five independent fact-checking organizations in the United States.
No intelligence or classified material was shared at the meeting, according to a person familiar with the discussions. The participants agreed there were several key areas of concern that merited further discussion, including attacks on individuals and any U.S. government discovery of foreign groups or campaigns attempting to influence democracy or elections.
“It was a back-and-forth, with both sides talking about how they were thinking about the problem and how we were looking for opportunities to work together,” said one participant, who spoke on the condition of anonymity to describe a private meeting.