WARSAW — Most political parties in Poland have complaints about Facebook’s algorithms, the obscure formulas that decide which posts pop up on a user’s news feed and which fade into the ether.
That Facebook might be amplifying outrage — while driving polarization and elevating more-extreme parties around the world — has been debated inside the company for years, according to the internal documents known as the Facebook Papers, which were disclosed by the whistleblower Frances Haugen to the Securities and Exchange Commission. Redacted versions were reviewed by a consortium of news organizations including The Washington Post.
In one April 2019 document detailing a research trip to the European Union, a Facebook team reported feedback from European politicians that an algorithm change the previous year — billed by Facebook chief executive Mark Zuckerberg as an effort to foster more “meaningful” interactions on the platform — had changed politics “for the worse.”
The Facebook team reported back specific concerns from Poland, where political parties had described a “social civil war” online. Such worries have prompted reviews by regulators and lawmakers across the continent, including proposals at the European Parliament to force more transparency from Silicon Valley’s tech giants.
The Eastern European country, led by the populist Law and Justice party since 2015, is bitterly divided between ardent supporters of the government and equally committed critics. Battle lines are drawn over such issues as abortion, LGBT rights and a fight with the E.U. over the primacy of the laws that bind the 27-nation bloc.
In Warsaw, the two major parties — Law and Justice and the opposition Civic Platform — accused social media of deepening the country’s political polarization, describing the situation as “unsustainable,” the Facebook report said.
“Across multiple European countries, major mainstream parties complained about the structural incentive to engage in attack politics,” the report said. “They see a clear link between this and the outsize influence of radical parties on the platform.”
An independent data analysis of major political parties in Poland that was conducted for The Post showed that after 2018, negative messages were more likely to receive a high number of shares. Previously, it appeared that more of a mix of positive and negative posts did well.
Some Facebook employees recognized the need to act, the documents show, but not just out of concern over the potentially damaging impact on society. Some employees also felt that revising the algorithms was best for long-term growth, likening outrage-centric content to junk food.
“We can choose to be idle and keep feeding users fast-food, but that only works for so long,” the internal report said. “Many have already caught on to the fact that fast-food is linked to obesity and, therefore, its short-term value is not worth the long-term cost.”
For more than a decade, Facebook has ranked content using a complex formula that assesses at least 10,000 data points each time it decides what to show a user. In 2018, Facebook made a big change to that formula to promote “meaningful social interactions.” The changes were billed as a way to make the news feed more focused on posts from family and friends and less on those from brands, businesses and the media. The new formula weighted the probability that a post would produce an interaction, such as a like, emoji reaction or comment, more heavily than other factors.
But that appeared to backfire. Haugen, who this week took her campaign against her former employer to Europe, voiced a concern that Facebook’s algorithm amplifies the extreme.
“Anger and hate is the easiest way to grow on Facebook,” she told British lawmakers Monday. She will appear in Brussels early next month.
“Facebook wants blood,” said Anna Sikora, who handles social media for the left-wing Razem party, and who did not meet the Facebook team during its trip. “If we quote a stupid statement of our political opponent, our post reaches lots of people.”
Razem’s social media team tried to share more posts in groups or use short videos to get a wider reach. But that has had limited impact. “If there’s no blood, it is likely to only be seen by our social bubble,” she said. “Even most of our voters will never see it.”
In Europe, the details of the internal Facebook report — parts of which were first reported by the Wall Street Journal — have opened debate on the extent to which Facebook has fanned and shaped a trend in some countries toward more-toxic, deeply polarized politics.
Social media’s permeation of politics in Europe has occurred alongside epochal events that cleaved fresh divisions. Most notable was the 2015 migrant crisis — a huge wave of refugees and others from war-torn Syria and elsewhere — which helped to boost far-right parties and their nativist rhetoric across the continent.
When it comes to polarization, academic research does not support “the idea that Facebook, or social media more generally, is the primary cause of polarization,” said Facebook spokeswoman Dani Lever. “If it were true that Facebook is the chief cause of polarization, we would expect to see it going up wherever Facebook is popular. It isn’t.”
“Is a ranking change the source of the world’s divisions? No,” she said. “We’re continuing to make changes consistent with this goal, like new tests to reduce political content on Facebook based on research and feedback.”
Facebook has said its systems are not designed to push provocative material. “The argument that we deliberately push content that makes people angry for profit is deeply illogical,” Zuckerberg said earlier this month. “We make money from ads, and advertisers consistently tell us they don’t want their ads next to harmful or angry content.”
Several fixes were floated within Facebook as part of the report on the impact of its algorithms on European politics — alongside similar complaints from media organizations — including rethinking “current incentive structures.”
Several employees suggested adjusting the weight given to the angry emoji within the algorithms to make it less likely that negative posts break through. The angry emoji, like other emotional reactions, initially carried five times the weight of a like, the documents show. It was gradually downgraded, and by Oct. 1, 2020, the weight of an angry reaction had been reduced to zero.
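The mechanics described above — scoring posts by a weighted sum of predicted interactions, then cutting the angry reaction's weight to zero — can be sketched roughly as follows. This is purely illustrative Python, not Facebook's actual system: the interaction types and all weights are invented for the example, except the reported five-times-a-like weight on emotional reactions and the later reduction of the angry weight to zero.

```python
# Illustrative sketch only — not Facebook's real ranking code.
# A post's score is a weighted sum of predicted interaction probabilities.
def score(post, weights):
    """Weighted sum over each interaction type's predicted probability."""
    return sum(weights[kind] * p for kind, p in post["predicted"].items())

# Hypothetical weights. Per the documents, reactions initially carried
# five times a like's weight; by late 2020 "angry" was weighted at zero.
WEIGHTS_2018 = {"like": 1, "love": 5, "angry": 5, "comment": 15, "share": 30}
WEIGHTS_2020 = {"like": 1, "love": 5, "angry": 0, "comment": 15, "share": 30}

posts = [
    {"id": "calm",
     "predicted": {"like": 0.20, "love": 0.05, "angry": 0.01,
                   "comment": 0.02, "share": 0.01}},
    {"id": "outrage",
     "predicted": {"like": 0.05, "love": 0.01, "angry": 0.30,
                   "comment": 0.02, "share": 0.01}},
]

# Under the 2018 weights the anger-heavy post ranks first; zeroing the
# angry weight flips the ordering for these (invented) probabilities.
for weights in (WEIGHTS_2018, WEIGHTS_2020):
    ranked = sorted(posts, key=lambda p: score(p, weights), reverse=True)
    print([p["id"] for p in ranked])
```

With these made-up numbers, the first ranking puts the outrage post on top and the second puts the calm post on top, mirroring the change the documents describe.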
So far, the Confederation hasn’t noticed much of an impact. “This algorithm is not the worst thing,” said Grabarczyk, a 27-year-old with neatly gelled hair, a suit, crisp white shirt and gold cuff links that look out of place in the party’s somewhat ramshackle office in central Warsaw.
With 664,000 followers, the party — Konfederacja in Polish and initially formed as a coalition of two parties — has the biggest Facebook presence of all the country’s political parties, despite having just 11 seats in the 460-seat Parliament. Grabarczyk said the party tiptoes along the “border” where it could run afoul of hate-speech moderators.
“Where is the wall, which as #Confederation we have been demanding for over 2 months?!” asks one of Confederation’s posts with a video of people trying to pull down barbed wire; the video garnered 1,900 likes and reactions and more than 1,000 comments. Another video, with more than 4,900 likes and reactions, features a doctor raising questions about the vaccination of children: “No to propaganda and intimidation!” it concludes. “Pass it on!!”
Facebook says it knows people try to “game our systems” to avoid enforcement. “We work to identify and remove content that breaks our rules and prohibit altogether organizations that continuously violate our community standards, and have removed accounts connected to different political parties and social movements, including politicians associated with Konfederacja for violating our policies in the past,” said Lever.
Janusz Korwin-Mikke, one of the party’s politicians who was at the time the most popular Polish politician on Facebook, was banned from the platform last year.
It was the presidential election in 2015 that woke Polish politics to the powers of Facebook, said Pawel Rybicki, who worked on the campaign for President Andrzej Duda. “We used social media full-scale,” Rybicki said. Duda, an ally of Law and Justice, had been considered the underdog but won with 51.5 percent of the vote.
“It was like a war, and social media was the new gun for Polish political parties,” recalled Rybicki, who met with the Facebook team when it was in Warsaw but said he largely raised concerns regarding moderation.
A consultant to the social media team for the Civic Platform party, who spoke on the condition of anonymity to discuss that party’s social media strategy, described those days as the “wild West,” with apparently little content moderation on Facebook. He said that he, like others, noticed a shift in 2018, with more-extreme content breaking through.
Facebook says it has had teams reviewing content in Polish since before 2015 and currently has 40,000 people working on safety and security.
“The walls of the bubbles are thicker and thicker,” he said, referring to the echo chambers that different parts of society occupy on social media. “One bubble, that mainly delivers anti-democratic statements, [is] much more shown by Facebook.”
Internal messages show that such questions are continually dissected by Facebook employees, according to the documents. In one, they dug into a 2021 study’s finding that extreme parties tended to elicit relatively greater emotional responses on Facebook such as anger and love.
“I am not comfortable making judgments about some parties being less good for society and less worthy of distribution based on where they fall on the ideological spectrum,” responded one commenter inside the company. “All the parties in the analysis are legal.”
“I don’t think we should be judging political parties directly either,” another said. “But I do think we should be questioning what kinds of messages are being amplified by virtue of our algorithmic choices.”
The challenge for European regulators remains how to make sure Facebook and other social media platforms do not amplify far-right voices and further tear at the political center.
“It’s difficult to legislate,” said Alexandra Geese, a German lawmaker in the European Parliament involved in stewarding proposals to force greater openness by Facebook and other social media companies.
The key to any effort will be transparency regarding algorithms, she said. She called it “startling” that Facebook had never publicly revealed it had received feedback from European politicians on the negative impact of its algorithms.
“Many areas of our political debate seem broken because of division,” said Damian Collins, the head of the British parliamentary committee charged with drafting online safety legislation. “If that division is being driven by social media platforms in the way that they are being designed, that’s something that I think we have the right to know about, because that’s a direct attack on our democracy.”
A lack of data complicates any outside analysis of what is working and what is not on Facebook, said Dominik Batorski, a founder of Sotrender, a social media analytics firm, who met with the team from Facebook when it visited Warsaw.
Sotrender’s analysis of public data on posts — such as reactions and shares — showed that negative posts appeared to break through more after 2018. Yet not everything in the findings reflected the complaints that Polish politicians voiced to Facebook.
According to Facebook’s internal report, the social media management team of one Polish political party (which was not named) described its shift: moving from a roughly even split of negative and positive messages, to 80 percent negative. The Civic Platform’s social media team did not say whether the Facebook report mirrored feedback from someone at the party. Law and Justice officials declined interview requests.
The Civic Platform appeared to shift to posts with more-negative sentiment in 2015, when the party lost elections and before the algorithmic changes, Batorski’s analysis showed. In contrast, Law and Justice skewed more to the positive on its main Facebook page after 2018, Batorski said.
Batorski’s analysis took into account only official pages for the parties; much of their online war takes place on platforms not affiliated with the parties. Rhetoric attacking LGBT “ideology” during Law and Justice’s recent campaigns, for example, was not expressed on its main account, Batorski said.
In another analysis of Polish parties’ Facebook posts during one week last year, the 14 most popular were from the Confederation.
“Ranking top page posts by reactions and comments doesn’t paint a full or accurate picture of what people actually see on Facebook, because engagement doesn’t equate to or predict reach,” said Facebook’s Lever, referring to the number of people who actually get to see a post — data that is not public.
Batorski said Facebook had appeared proactive in gathering feedback ahead of European Parliament elections in 2019.
But he said Facebook focused on transparency regarding political advertisements, which is not as much of a concern in Europe because ad spending is generally low. He said he raised concerns over such issues as Facebook pages built solely to push negative campaigning.
The Confederation has been helped online by a galvanized and much younger support base than most other parties, giving it an outsize presence on social media. Formed only in late 2018, the party won 7 percent of the vote in elections a year later. It had not expected to enter Parliament, Grabarczyk said.
“We did everything on the Internet, everything,” he said. “I’d say it’s 70 percent thanks to Facebook.”
Dariusz Kalan contributed to this report.