In November 2018, the staff of Facebook’s fledgling Civic Integrity department got a look at some eye-opening internal research — presented under an image of two goats locking horns.

The report examined articles shared on Facebook from the New York Times, BuzzFeed, Fox News and a dozen other media outlets and found that the more negative-slanting comments a story drew, the more likely Facebook’s algorithms were to promote it widely.

“Outrage gets attention,” surmised the researchers. They ruefully compared the strategy to “feeding users fast food,” an irresistibly effective tactic for hooking an audience that would surely prove harmful down the road.

“Definitely not the results we want,” bemoaned one Civic Integrity team member in a comment posted under the report. Another wrote that “this is why I answer ‘neutral,’ ” when asked on workplace surveys if Facebook was a force for good in the world. “I really hope we can change the incentives.”

Yet, even as Facebook staffers agonized over whether the company was nudging news organizations to produce “darker, more divisive content,” according to communications contained within just-released internal documents, the company had already made changes that would further boost some of the most extreme ideological websites over moderate and neutral news sources.

The seismic impact of Facebook’s algorithms on the news industry — which, struggling for a digital business model, had come to rely heavily on Facebook for readers — has been widely reported. But while this new cache of documents does not capture the totality of the decision-making process within Facebook, it offers a vivid glimpse inside the company, including how some crucial algorithmic changes were studied and viewed within certain corners of the social media behemoth. The documents also confirm previous reporting that some employees had been sounding alarm bells for years over the company’s practice of favoring right-wing publishers and were dismayed that it did not or could not do more to tamp down on misinformation and divisiveness on its platform.

Redacted versions of these documents were shared with the U.S. Securities and Exchange Commission and Congress by lawyers for whistleblower Frances Haugen and were reviewed by a consortium of news organizations, including The Washington Post.

A Facebook spokesman said the company has made changes to address troubling content on its site since the time concerns were first raised internally.

In the company’s earnings call on Monday, Facebook founder Mark Zuckerberg reiterated a company talking point: that Facebook could not be held solely responsible for political divisions in the country, nor for the state of the media business. “Polarization started rising in the U.S. before I was born,” he added, and Facebook “can’t change the underlying media dynamics.”

For years, Facebook has been attempting to grapple with the vitriolic political messaging that flooded its users’ screens in the lead-up to the 2016 election and its aftermath, prompting fierce criticism that Facebook had allowed political operators — such as Cambridge Analytica, a Trump-affiliated consultancy that abused tens of millions of Facebook profiles — to manipulate its platform.

To try to turn the tide, Zuckerberg began 2018 by announcing that Facebook would alter its algorithm to encourage what he called “meaningful social interactions” and to reduce the prominence in users’ feeds of posts that came from “businesses, brands, and media.” Zuckerberg later acknowledged that the overall number of news stories would be reduced but said Facebook would “prioritize news that is trustworthy, informative, and local.” What counted as trustworthy, he said, would be determined by asking users in quality surveys.

Facebook spokesman Andy Stone said the company took steps to make publishers aware of what the changes would mean for them, with briefings for individual news organizations and industry associations. But the effect of the algorithm changes quickly became apparent at media sites that had come to rely on social media to find an audience. Traffic plummeted at many news organizations: Mother Jones reported a 37 percent drop in its Facebook referrals. Slate noted that its referrals crashed by 81 percent. Vox laid off 50 employees, citing the impact of lost Facebook traffic. Some smaller sites, noting the effect of the algorithmic shift, shut down completely.

The newly released internal documents reveal that one publisher articulated the crisis directly to Facebook. BuzzFeed had racked up early success after its 2006 launch thanks to its stories going viral on Facebook; eventually, though, it branched out from meme-ified humor and cute cat videos into serious journalism, winning a Pulitzer this year. Yet according to an undated internal Facebook report, BuzzFeed CEO Jonah Peretti warned Facebook that the algorithmic changes intended to boost “meaningful” interactions were having the opposite effect.

Peretti complained that the more significant stories his team created had far less success and far less promotion on Facebook than “fad/junky science,” “extremely disturbing news,” “gross” images and content exploiting racial divisions. (Some elements of Peretti’s complaints were previously reported in a story last month by the Wall Street Journal, which received many of the new Facebook documents first. BuzzFeed did not comment for this story or the Journal’s.)

In one document, from 2019, Facebook’s Civic Integrity team warned that “we are affecting media ecosystems by creating perverse incentives.” By way of illustration, the report provided a chart showing how smaller, opinion-heavy sites such as the Daily Wire, Breitbart and the Western Journal had a higher ratio of story clicks per employee than large news organizations including the New York Times, The Washington Post and USA Today.

“Fewer people are incentivized to produce original in-depth content and invest in journalism,” researchers noted. Instead, they wrote, “publishers hire specialists in repackaging other outlets’ reporting with more social media friendly and often divisive headlines to gain distribution, clicks and revenue.”

Asked about the revelations about Facebook’s impact on the media industry, a spokesman noted that the company has invested $175 million in local news, via conferences, grants and sponsorships, since 2019. He also pointed to past statements Facebook has made about prioritizing original reporting and news on the platform.

Some conservative publishers argued that they, too, were unfairly hurt by Facebook’s altered approach to news; Tucker Carlson called the algorithm shift “an act of ideological warfare.” Facebook had been a target of conservative ire since the spring of 2016, when a former employee alleged that the site’s “trending” news section downplayed stories that appealed to conservatives, prompting congressional inquiries and an extensive internal investigation.

And yet, some conservative news outlets seemed to fare well after Zuckerberg’s algorithm shift. Politico reported in March 2018 that both Fox News and National Review saw their interaction rates increase.

Today, by some outward indicators, conservative sites with a vehement ideological bent appear to be flourishing on Facebook. A daily roundup using data from Facebook’s CrowdTangle shows right-wing and conservative personalities such as Ben Shapiro and Dan Bongino consistently garnering the most engagement. (Facebook has long pushed back on the importance of top 10 lists, saying they offer a skewed view of popularity.) An August article from Breitbart, an early and loyal media ally of former president Donald Trump, touted three months of CrowdTangle data to boast that it was “demolishing its establishment foes on Facebook.”

An investigation last year by the Wall Street Journal, and similar reporting by Mother Jones, offered one potential explanation: Facebook policy executives had been so fearful that Facebook’s 2018 attempt to de-emphasize political news would have a disproportionate effect on conservative publishers that its engineers tried to compensate for it — by making algorithmic tweaks to reduce the visibility of liberal news sites, including Mother Jones. (Facebook denied that it made changes that targeted specific publishers.) The Washington Post also previously reported that Facebook had tweaked its news feed algorithm to protect conservative publishers out of fear of accusations of political bias.

Now, the new cache of internal Facebook documents provides more insight into that dynamic. Some memos include assertions by Facebook staffers that when conservative publishers engaged in behavior that ran afoul of Facebook’s rules, the company often let them off the hook.

“A fear of political backlash was a contributing factor” in decisions not to designate conservative publishers Breitbart and PragerU, or conservative personalities Charlie Kirk and Diamond and Silk, as “repeat offenders” for promoting misinformation — a designation that is supposed to trigger a temporary block on ads — one Facebook staffer stated in an internal post from 2020. Another conservative outlet, the Daily Wire, the staffer wrote, seemed “to have been consistently exempted from punishment” for running afoul of Facebook’s rules against collaborating with other groups to echo and amplify falsehoods. The warning echoed a previous callout of the same issues in August, first reported by BuzzFeed, by an engineer who was subsequently fired.

The documents did not outline what specific violations had allegedly occurred. Referring to Diamond and Silk, two passionately pro-Trump video bloggers, the Facebook document noted the duo “is extremely sensitive and has not hesitated going public about their concerns around alleged conservative bias on Facebook.” A 2020 NBC News story reported that Facebook managers intervened to remove “strikes” from the internal records for Diamond and Silk, allowing them to avoid repeat-offender status, a move employees believed reflected preferential treatment. (NBC reported that about two-thirds of such “escalations” — cases flagged for senior management to consider — were for conservative pages.) The Washington Post reported that ahead of the 2020 election, Facebook removed misinformation strikes for former president Donald Trump’s son Donald Trump Jr.

Stone, the Facebook spokesman, said that while the company defers to third-party fact-checkers to rate posts, Facebook takes responsibility for “how we manage our internal systems for repeat offenders,” including whether a specific negative rating warrants any consequences.

John Bickley, editor in chief of the Daily Wire, disputed the notion that Facebook had boosted its audience, arguing that “Facebook has often pursued policies directly at odds with our business,” and that “Americans seek out our content because, as the polls show, legacy media is widely mistrusted, and for good reason.” PragerU executive Craig Strazzeri called claims that Facebook aided conservatives “absurd,” adding that “if they were really trying to help us, they are doing a terrible job.” A spokesman for Kirk maintained that prominent conservatives are “routinely” targeted by Facebook and unfairly labeled as misinformation: “Any ‘exceptions’ must have been made only after gross bias had already been demonstrated,” said Andrew Kolvet.

Concerns about a conservative backlash also emerged in discussions in 2019 over whether to drop two tools that Facebook’s engineers used to rein in “hyperposter” accounts, which frequently share large numbers of posts, and to curb the spread of articles shared by people who hadn’t actually read them, according to the internal documents. The company’s studies had indicated that eliminating those tools would drive more Web traffic toward politically extreme sites, especially far-right publishers such as Breitbart — and that surge, some Facebook staffers feared, could lead to accusations from those publishers that their traffic had previously been suppressed.

“We could face significant political backlash for having ‘experimented’ with distribution at the expense of conservative publishers,” an internal analysis warned. The documents show Facebook considered removing the measure restraining hyperposters after the 2020 elections — a move some staffers lamented because it would boost traffic to extreme websites. But that measure remains in place; the other control was removed, according to Facebook.

The internal anxiety expressed in the new documents represents Facebook’s perpetual discomfort “in deciding which bit of power to align with,” said Emily Bell, director of the Tow Center for Digital Journalism at Columbia Journalism School. (Last week, The Post reported that another whistleblower recounted hearing a top Facebook executive argue in favor of exempting Breitbart and other publishers from the typical rules against spreading false reports, saying “Do you want to start a fight with Steve Bannon?” — the top Trump adviser and former Breitbart editor. The executive denied giving preferential treatment to any publishers.)

“They nominally want to align with the quality press,” Bell argued, “but Trump and his significant presence and influence on the platform just proved too powerful.”