We all know what kinds of posts we see when we open Facebook. But what is everyone else seeing in their personalized feeds? And just how much of it is divisive, misleading, or outright false?

Those questions have never had a definitive answer, partly because Facebook keeps much of the relevant data secret. Analytics tools such as Newswhip, which is independent, and CrowdTangle, which Facebook owns, have provided windows into what is trending on the social network. And a Twitter account called Facebook's Top 10, run by New York Times technology columnist Kevin Roose, drew on CrowdTangle's data to produce weekly lists of top-performing U.S. Facebook pages — many of which turned out to belong to conservative or even right-wing political personalities. Meanwhile, Facebook has endured harsh criticism from President Biden and other officials who view it as teeming with conspiracies and misinformation.

Facebook has long argued such “top 10” lists present a skewed view of its platform, making conservative commentators such as Ben Shapiro, Dan Bongino, and Franklin Graham seem more popular than they really are. But it has struggled to back up that claim without releasing more of its own data.

On Wednesday, the social giant announced that it will begin publishing a quarterly report of its own, called the “Widely Viewed Content Report,” that slices its data along new lines to produce a very different set of rankings. Instead of presenting Facebook as a hotbed of right-leaning politics, the company’s inaugural report presents a far weirder, messier, and spammier picture: the news feed as a junk-mail folder.

It shows, for instance, that the most-viewed link on Facebook in a recent three-month period was to the website of a Wisconsin firm that offers to connect Green Bay Packers fans to former players. The second-most-viewed link was to the online storefront of Pure Hemp, which sells CBD products. The third-most-viewed post in the same period, by Facebook’s reckoning, no longer exists at all. A preview of the post that appears in search engines suggests it was a viral meme encouraging users to disclose personal information to discover their “porn name.”

The new report comes as part of a broader push by Facebook to block or discredit independent research about harmful content on its platform, offering its own carefully selected data and statistics instead.

“It’s like ExxonMobil releasing their own study on climate change,” said a former Facebook employee, who spoke on the condition of anonymity due to a nondisparagement clause. “It’s something to counter the independent research and media coverage that tells a different story.”

Also on Wednesday, Facebook published a blog post rebutting an influential recent report by a nonprofit that found a dozen “superspreaders” of false claims — the so-called “Disinformation Dozen” — were responsible for some 73 percent of misinformation about the coronavirus vaccines on Facebook. “There isn’t any evidence for this claim,” wrote Monika Bickert, Facebook’s vice president of content policy, of the finding by the Center for Countering Digital Hate. She offered in its place a different statistic, asserting that those 12 people are responsible for just 0.05 percent of all views of vaccine-related content on Facebook. Facebook itself didn’t offer any evidence to back up that figure.

Earlier this month, Facebook shut down the accounts of NYU researchers who had been studying how misinformation spreads through political ads on its platform. The company cited privacy concerns, an explanation that drew a rare public rebuke from a Federal Trade Commission official.

Roose’s “Facebook’s Top 10” lists of popular Facebook content focused on posts from public Facebook pages that included links, which Roose said was a way to zero in on content that is more likely to be newsy and political, and thus of public interest. Facebook’s new report repeatedly emphasizes that such content represents a tiny fraction of everything users see in their feeds. For instance, it noted that less than 13 percent of all views of content in the news feed in the second quarter were on posts that included links.

For its own report, Facebook opted to share four lists that included eclectic mixes of domains and content, with viral animal memes, cooking pages, and sites hawking Christian merchandise crowding out big-name media publishers. Notably, its rankings were drawn from the three-month period between April 1 and June 30, 2021, so they offer no insight into what was popular in the months preceding or immediately following the 2020 election or the Jan. 6 Capitol riot. They also don’t show what has been trending recently.

Asked if he knew why an obscure Green Bay Packers alumni website was the most popular link in Facebook’s data set, spokesperson Ryan Peters replied, “When content from lesser known creators goes viral it isn’t necessarily a bad thing. It shows that anyone, not just established superstars, can reach a wide audience on the platform so long as their content is compelling.” Peters added that the decision not to focus the report on news or political content was intentional, reflecting the fact that most people’s feeds are dominated by posts from their own friends and family, not news outlets or ideologues.

Under other circumstances, researchers might welcome new data such as the lists Facebook shared in its new Widely Viewed Content Report. But the immediate reception from some academics in the space was ambivalent.

Brendan Nyhan, a Dartmouth political science professor who researches online political and health information, called the report “an important corrective to analyses that focus on the highest-engagement public page content on the platform,” because public page content is only a small part of what most people see on Facebook. But he also called on Facebook to provide “greater transparency,” noting that a single quarterly snapshot is a poor substitute for tools such as CrowdTangle that allow researchers, journalists, and the public to perform their own queries.

Jennifer Grygiel, an associate professor of communication at Syracuse University’s Newhouse School who studies social media, was unimpressed by Facebook’s initial report. “This is yet more PR,” they said. “They’re trying to control the narrative around CrowdTangle reporting” and counter the work of the Times’ Roose and others with their own handpicked metrics.

Roose reported in July that Facebook has moved to rein in CrowdTangle, which had operated with a degree of autonomy within the company since its 2016 acquisition, because executives were upset at how some critics were interpreting the data. Asked by The Post on Wednesday whether it will continue to publish CrowdTangle data, the company did not give a direct answer, but it offered a statement from Guy Rosen, its vice president for integrity.

“This is another step on a long journey we’ve undertaken to be, by far, the most transparent platform on the Internet,” Rosen said of the new reports. “We’re continuing to think through additional transparency features we can offer.”

Drew Harwell contributed to this report.