The White House in recent weeks has been pushing Facebook and its peers — but especially Facebook — to throttle lies and propaganda about the vaccines that could finally put an end to this pandemic, if only more people trusted them. This ambition is understandable. While the shift by several prominent conservative commentators toward encouraging inoculation is promising, many more Internet influencers are continuing to peddle conspiracies. These misinformation mongers themselves are the most to blame. Yet platforms that give them a place to spread their lies, and whose algorithms sometimes help those lies along, bear some responsibility.
Where the administration is wrong is in suggesting that Facebook hasn’t acknowledged this responsibility. The company has plastered accurate information all over its properties, and it has taken as aggressive a line against inaccurate information about vaccines as it does against any other category of content. The reality is that fighting misinformation is hard because defining misinformation is hard: moderators must find the fine, flickering line between the opinion of someone who says they don’t approve of these vaccines and the provably false assertions of someone who says these vaccines kill.
Where Facebook is wrong is in using these difficulties as an excuse not to try harder. The company disputes the White House’s references to a report on a so-called “disinformation dozen,” which finds that 12 people were responsible for 65 percent of the covid-19 misinformation on the platform. Yet it doesn’t provide much meaningful data on, well, covid-19 misinformation on the platform — opting to tout only the 18 million misleading posts it has removed without saying anything about how many misleading posts it hasn’t, and skirting entirely the question of who is seeing these posts, and where. Are lies spreading in clusters of pages and groups? What role does that pesky algorithm play? The New York Times reports that employees asked executives last year for more resources to measure precisely these prevalence statistics and that those resources were never approved.
There’s undoubtedly danger in letting people spew medical misinformation during a pandemic, especially when those people are repeat offenders. But there’s danger, too, in the government’s painting a private company as a criminal culprit. Wielding these exaggerations as weapons to goad firms into compliance risks giving cover to conservatives who seek to paint sites as unconstitutional censors, as well as to authoritarians abroad who actually seek to censor. By gathering and sharing the data necessary to understand the spread of misinformation — both when it shows the platform succeeding and when it shows it failing — Facebook would do everyone, including itself, a favor.