That struggle has been evident at the social network in recent weeks. Over the weekend, its leaders scrambled to manage Facebook's message in the wake of an indictment by special counsel Robert S. Mueller III that laid out how Russian operators used Facebook and other social media platforms to manipulate American voters.
And just days before the indictment, another Facebook executive disclosed that the company wasn't yet sure how to put in place Zuckerberg's latest major directive: shifting the company’s metrics so that “meaningful interactions” are valued over likes and clicks, a response to the misinformation and reports about the harms of social media that drew attention last year.
“Even if [Zuckerberg] says, 'Resolve this right away,' the problems are baked into the fundamentals of the platform,” said Jonathon Morgan, chief executive of New Knowledge, a company that tracks disinformation. “It's not like Mark Zuckerberg just comes to the floor, makes a command, and everything turns around. The changes are a real threat to the way that these people think about success at their jobs.”
Almost immediately after the indictment landed, Facebook's vice president of global policy, Joel Kaplan, seized the opportunity to appear cooperative with the regulators and critics the company has clashed with in recent months over how forthcoming it has been. He boasted that “today's news confirms our announcement last year that foreign actors conducted a coordinated and sustained effort to attack our democracy,” and that the company had readily handed over information on Russian interference to authorities. He said he was “grateful that the US government was now taking this aggressive action against those who abused our service.”
But hours later, another Facebook vice president, Rob Goldman, who runs the company's massively lucrative ad business, seemed more defensive about Facebook’s role. “I have seen all of the Russian ads and I can say very definitively that swaying the election was *NOT* the main goal,” he tweeted, adding that the majority of Russian ads ran after the election. “We shared that fact,” he wrote, “but very few outlets have covered it because it doesn’t align with the main media narrative of Trump and the election.”
Researchers and experts pounced on Goldman, pointing out that the bigger challenge for Facebook did not relate to ads but to the free content posted by Russians that the company has said reached nearly half the U.S. population — 10 times the number of people who saw the ads. (Facebook has not said whether the majority of the free posts appeared after the election.) Others suggested that his conclusions contradicted the special counsel’s indictment, which found that election meddling on Facebook was indeed a priority of the Russian operatives.
The comments from Goldman, who had 1,600 followers at the time, could have been a standard Twitter debate between professionals — until President Trump broadcast the comments to his 48 million followers the following day. Goldman now has more than 11,000 followers.
The controversy sent Facebook executives scrambling over the weekend. They tried to recast the statements, emphasizing that Goldman was speaking for himself, without prior approval. Kaplan, trying to put an end to the matter, issued an additional company statement saying that “nothing we found contradicts the Special Counsel’s indictments. Any suggestion otherwise is wrong.” They vented internally that Goldman had damaged hard-won credibility with the public. By Monday, Goldman, posting on Facebook this time, had issued an apology to his colleagues.
Goldman is part of a social-media-oriented culture that is permissive of employees having a voice and becoming “thought leaders” in their field. Several Facebook executives are regular Twitter users, including Andrew Bosworth, who initially shared Goldman’s tweet.
But that culture is increasingly pushing up against the need of big tech companies such as Facebook to defend their credibility. Marc Andreessen, a Facebook board member, shut down his popular Twitter account after making offensive comments about Facebook’s goals in India. After Google fired a conservative engineer for making derogatory statements about women, employees took to social media to protest both sides of the issue. Engineers across Silicon Valley pushed their chief executives to speak out against Trump’s immigration ban.
Recently, a Facebook executive spoke publicly about the challenges of implementing Zuckerberg's latest directive to make the product less harmful by measuring time well spent and meaningful interactions.
“We’re trying to figure out how to best measure and understand that,” Adam Mosseri, Facebook vice president who manages the company’s news feed, said at an industry conference. “The metric is definitely evolving.”
Later, Mosseri took to Twitter to explain himself further. In response to a question from a Washington Post reporter, Mosseri said the company was considering placing more emphasis on comments and messages, particularly long ones, after surveys found that users reported those features to be meaningful to them. He said the company was trying to define how to give people news that is genuinely informative. Currently, the company shows people highly personalized content that its algorithm predicts they will find informative.
“But there is a real difference between feeling informed and being informed,” he wrote. “We have yet to work out a way to do the latter.”