Pressure on Facebook to rethink its approach to political ads came from a wide array of federal regulators, digital experts and privacy advocates, as well as some of the company’s own employees. They argued that its policies coarsened American political debate and exposed users to serious risks, including viral disinformation, which malicious actors could pay to promote on the site.
But Facebook ultimately sided with President Trump’s reelection campaign and other political strategists, both Democrats and Republicans, who had fought fiercely behind the scenes to keep the digital tools that have helped them find new supporters, solicit donations and mobilize voters on Election Day.
The decision sparked a series of rebukes, however: Ellen L. Weintraub, who serves on the Federal Election Commission, sharply criticized the tech giant’s approach as “weak” and motivated by a desire to boost its profits.
Top Democratic presidential candidates, including Sen. Elizabeth Warren (Mass.), also pilloried the company, expressing concern that Facebook had essentially paved the way for Trump to lie to its users with impunity. And on Capitol Hill, Sen. Ron Wyden (D-Ore.) accused Facebook of trying to “fool people with fig leaves instead of taking real action.”
Under its new policies, Facebook said it would give users a choice to see fewer ads about political candidates and social issues, using a tool under development that it plans to roll out in the summer. Users can also opt to stop seeing ads from particular campaigns and other entities, including businesses, that target them using custom lists of data, such as their email addresses. And the company announced it would provide more information in its public archive about the total number of people targeted in an ad campaign.
In a blog post announcing the changes, Rob Leathern, Facebook’s director of product management for ads, wrote that the company is “not deaf” to criticism about its rules around political ads. But he maintained that the changes would “increase the level of transparency it provides for people and [give] them more control over the ads they see.”
Computer scientists who have studied Facebook’s ad tools cast doubt on that conclusion.
“Of course giving users choice is a step in the right direction, but an opt-out option probably won’t make much of a difference because most users stick to the default,” said Piotr Sapiezynski, a research scientist at Northeastern University.
Thursday’s announcement marked the latest instance when Facebook — under fire for a wide range of its business practices — sought to wait out a controversy before ultimately offering only modest changes that failed to satisfy skeptical government officials.
On Tuesday, for example, the company issued a new policy on video manipulation, or deepfakes, that still allows an infamous altered clip of House Speaker Nancy Pelosi appearing drunk to remain on the site. The move came months after the clip went viral, sparking bipartisan scorn.
With its affirmation that it will not curtail targeting, Facebook set itself apart from Google and Twitter, each of which introduced major reforms last year in response to a prolonged outcry over the capacity to narrowly tailor messages to voters on social media. Trump’s reelection campaign helped catalyze the changes, after his team ran false ads about 2020 Democratic hopeful Joe Biden. Facebook and Google refused to take down the ads about the former vice president, drawing widespread criticism.
Twitter’s approach was the bluntest: banning all advertisements about candidates, elections and political issues such as abortion and immigration. The ability to reach voters online should be “earned, not bought,” the company’s chief executive, Jack Dorsey, said in announcing the move, which drew sharp rebukes from the Trump camp.
Google opted to preserve political advertising, including certain targeting capabilities, but the company limited some of the most precise tools for reaching specific users, prompting bipartisan backlash from political outfits that have grown accustomed to such powerful technologies. In doing so, it also preserved its own policy against fact-checking political ads on its services, including YouTube.
Facebook, however, said it sought to take a different approach. “While Twitter has chosen to block political ads and Google has chosen to limit the targeting of political ads, we are choosing to expand transparency and give more controls to people when it comes to political ads,” Leathern wrote.
Generally, Facebook’s advertising tools allow tailoring messages to lists of individual voters or to small groups based on characteristics such as age, education, Zip code, income, relationship status, interests or political leanings. Most powerfully, Facebook also allows the creation of “custom audiences” based on lists of individuals who, for example, donate to a cause or visit a web page that Facebook tracks. The result can be a torrent of different messages to different voters; Trump’s 2016 campaign, for instance, used tens of thousands of different ads each day.
Internal discussions about limiting political microtargeting began in 2017, as Facebook reeled from revelations about how Russian operatives used the platform to manipulate U.S. voters in the presidential election a year earlier. The idea did not immediately catch on, but interest surged last year as critics sought reforms at Facebook ahead of the 2020 presidential election.
In considering reforms, Facebook chief executive Mark Zuckerberg remained steadfast in his view that his company should not serve as an “arbiter of truth,” vetting what politicians can say. Instead, Facebook had considered a wide menu of other changes — including limiting the size of an audience that could be targeted with an ad and labeling paid political media to indicate it had not been fact-checked, The Post first reported.
But Facebook was pressured last fall by the Trump campaign not to restrict advertising opportunities. Gary Coby, the campaign’s digital director, argued that reining in targeting would be “dangerous” and a “huge blow to speech.”
On Thursday, Trump campaign spokesman Tim Murtaugh described Facebook’s reforms as “much better” than what Google and Twitter had done, saying the lack of limits on Facebook “encourages more Americans to be involved in the process.”
He insisted “our ads are always accurate” despite evidence to the contrary. The president’s campaign ran online ads in 2019 that included inaccuracies about Biden and his ties to Ukraine, according to fact-checkers, a move that triggered the debate over falsehoods on Facebook in the first place. Additional falsehoods promoted by the campaign have touched on issues such as the Russia investigation and immigration.
Democrats also warned Facebook about instituting major changes: They said they relied on targeting tools to raise money and mobilize supporters — critical in matching Trump’s enormous audience on Twitter.
Many in the party, however, urged Facebook to fact-check political figures, fearing their foes might pay to spread falsehoods on the social networking site. On Thursday, Priorities USA, a leading Democratic super PAC, said the changes came up short.
“These changes read to us mostly as a cover for not making the change that is most vital: ensuring politicians are not allowed to use Facebook as a tool to lie to and manipulate voters,” said Madeline Kriger, the group’s integrated media director, who oversees in-house digital ads.
The Democratic Party’s 2020 contenders echoed that view: Warren, who once intentionally ran a Facebook ad with a falsehood to call attention to the issue, said the company should be held accountable so that “democracy isn’t held hostage to their desire to make money.” A spokesman for Biden, Bill Russo, said the new policy served as “window dressing around their decision to allow paid misinformation.”
Facebook employees had called on the company in October to adopt sweeping changes to its ad rules. Researchers, meanwhile, warned about the dangers of ads focused on narrow communities of users.
“Microtargeting is what’s driving privacy abuses because it’s furthering the desire to grab as much information about people as possible from all possible sources,” said Aleksandra Korolova, a computer scientist at the University of Southern California. “It doesn’t serve the individual’s interest.”