CEO Mark Zuckerberg in a post cast the changes as a response to civil unrest and the potential for a chaotic Election Day in which final results could be delayed by a surge in mail-in voting amid the pandemic.
“The US elections are just two months away, and with Covid-19 affecting communities across the country, I’m concerned about the challenges people could face when voting,” Zuckerberg wrote in his post. “I’m also worried that with our nation so divided and election results potentially taking days or even weeks to be finalized, there could be an increased risk of civil unrest across the country. This election is not going to be business as usual. We all have a responsibility to protect our democracy.”
But the moves are also the latest bid by the company to prevent the kind of disinformation that ran rampant on Facebook in 2016, as Russian operatives and others flooded the platform with posts designed to manipulate American voters. Zuckerberg’s announcement was an implicit acknowledgment that Facebook’s previous actions had not gone far enough to prevent a repeat this year.
The close attention paid to the new policies — and the political backlash from Trump and others — underscores the central role Facebook and other social media platforms are playing in the 2020 election, a significant portion of which is virtual because of the pandemic.
Thursday’s announcement included several policies civil rights groups and disinformation experts have repeatedly recommended — even ones the company was initially reluctant to embrace. However, Facebook stopped short of banning all political advertising in the days surrounding the election or fact-checking the claims of politicians, as some critics had advocated and as the company has done in countries where ad blackouts are mandated in the days before a major election.
Even so, some outside critics and employees question Facebook’s ability to enforce its new policies at a time when disinformation is running rampant. Heavy demands have already overburdened the company’s content moderation teams, despite massive expansions in recent years, according to employees who spoke on the condition of anonymity for fear of retribution. In recent months, the company has delayed its responses to users’ reports of problematic posts and has been slow to enforce policies it had announced.
“I’m glad that they’re making these changes. This is important,” said Rashad Robinson, executive director of civil rights group Color of Change and one of the organizers of a large advertiser boycott against the company. “I also think, with Facebook, it’s always about enforcement. Can they actually enforce these policies, and will they actually enforce them?”
Facebook’s preparations for the challenges it will face during the presidential election have recently intensified. Experts and civil rights advocates have argued that Election Day threatens to be a disaster and that Facebook, with its huge reach among American voters and more than 256 million users in the United States and Canada, has an important role to play in heading that off.
The intense political stakes for the company were immediately underscored as the Trump campaign blasted the company for its new policies and Democratic campaign committees followed soon after with their own critiques.
“In the last seven days of the most important election in our history, President Trump will be banned from defending himself on the largest platform in America,” said Samantha Zager, the campaign’s deputy national press secretary. “When millions of voters will be making their decisions, the President will be silenced by the Silicon Valley Mafia, who will at the same time allow corporate media to run their biased ads to swing voters in key states.”
Meanwhile, some Democratic officials joined Republicans in objecting to the changes, though for different reasons — primarily that the new restrictions may hamper efforts to encourage voter turnout.
Facebook already struggles in some cases to enforce its existing policies on misinformation and voting, as well as its ban on content that incites violence. Last month, the company announced it was banning armed groups that called for violence at protests. But just a few days later, amid protests over the police shooting of a Black man in Kenosha, Wis., Facebook failed to take down a local militant group’s page calling for citizens to take up arms to defend the city. Users had flagged the material as problematic hundreds of times, and Zuckerberg acknowledged the “operational mistake” in a video post last week.
Earlier this week, Facebook announced a broad effort and collaboration with academics to research the social network’s role in democratic elections. One aspect of the project will involve paying research subjects to deactivate their accounts in the days before the election to create a control group, according to images viewed by The Washington Post.
One key sticking point in recent discussions between civil rights leaders and the company was whether Facebook’s policies would go beyond explicit efforts to suppress voting and tackle more subtle efforts, said Vanita Gupta, president of the Leadership Conference on Civil & Human Rights, a Washington-based group.
Zuckerberg’s post Thursday dealt with that distinction directly, saying Facebook was “now expanding this policy to include implicit misrepresentations about voting too, like ‘I hear anybody with a driver’s license gets a ballot this year’, because it might mislead you about what you need to do to get a ballot, even if that wouldn’t necessarily invalidate your vote by itself.”
Gupta called the new policy “a significant expansion” that should also flag the president’s posts unambiguously. “The test will come in how the company enforces this moving forward.”
Facebook’s new political ad ban sweeps up a broader set of posts classified by the social network as pertaining to politics, elections or social issues. Facebook says social issues are “sensitive topics that are heavily debated, may influence the outcome of an election or result in/relate to existing or proposed legislation.” Twitter last fall banned such ads outright.
Even as they clamor for Facebook to clamp down on misinformation, Democratic campaigns and activists have long argued that limits to the company’s ad technologies will end up hamstringing lesser-known and first-time candidates who have not built a robust audience.
Limits so close to the election, they also say, will directly affect get-out-the-vote efforts, which are seen as crucial to Democratic victories that traditionally depend on higher turnout.
“Facebook’s last-minute changes will not prevent disinformation from being shared organically and will still allow political campaigns to run ads with lies,” Sen. Catherine Cortez Masto (Nev.), chairwoman of the Senate Democrats’ fundraising arm, and Rep. Cheri Bustos (Ill.), who chairs the equivalent group for House Democrats, said in a joint statement Thursday. “At the same time, these changes will undermine efforts to ensure voters, particularly voters of color, who use Facebook as a resource can access accurate information — including when, where and how to cast their ballots.”
Ads placed in advance of the week-long blackout that encourage users to vote can remain active, and campaigns can boost or modify these posts through the election, including changing the audience to which they are targeted.
Facebook’s role in spreading political misinformation has been a hotly debated issue since the discovery that false news reports and a Russian operation to influence American voters flowed freely on the platform during the 2016 election. The controversy triggered political backlash and a scramble within the company to establish new policies and systems to avoid a repeat in 2020.
The issue of political advertising is particularly sensitive because of the central role Facebook played in Trump’s 2016 campaign, whose top officials have repeatedly boasted of their ability to use the company’s sophisticated advertising tools to gain advantage over Democratic opponent Hillary Clinton. It’s a view also widely shared within Facebook, whose officials have marveled at Trump’s effective use of their platform. He has 31 million followers there.
For this election cycle, Trump has spent $74.7 million on Facebook, significantly more than Democratic opponent Joe Biden’s $46.5 million, according to Facebook’s Ad Library. It shows a similar split in a recent seven-day span, with Trump spending $9.1 million compared with Biden’s $5.4 million between Aug. 26 and Sept. 1.
“This moment, more so than any others, Facebook knows that if they get it wrong, the company might be imperiled,” said Joan Donovan, director of the Technology and Social Change Research Project at Harvard’s Shorenstein Center. “The public is watching very carefully.”
Trump’s claims about mail-in voting and other election issues — widely disputed by election experts and independent fact-checkers — have been a particular source of concern for civil rights groups, which consider them an effort to suppress votes from Black people and other groups.
Zuckerberg’s post echoed disinformation experts’ recent warnings that the sharply polarized political environment could result in more domestic disinformation.
Some experts praised the steps Facebook promised to take, particularly around potential issues with early declarations of victory.
“Especially welcome is the commitment to promote only official results and downgrade or remove any content that misleads the outcome of the election when there is bound to be confusion in the hours, days and possibly weeks after election night,” said Graham Brookie, director of the Atlantic Council’s Digital Forensic Research Lab, which tracks disinformation. “But keep in mind, misinformation is designed to seep through the cracks and game the rules.”