“Bad for the world,” of course, is a comically subjective metric for deciding what the billions of users of the world’s most popular social media platform should see every day. Yet a similar anecdote in a New York Times story points to a perhaps more promising yardstick: a “news ecosystem quality” score that Facebook internally assigns to publishers to assess the authoritativeness of their journalism. These scores don’t usually hold much sway in determining where those publishers’ posts rank in people’s feeds — except after the 2020 election, when they did. Facebook had noticed a distressing surge in viral misinformation, so it gave the scores more weight. Suddenly, mainstream outlets such as CNN and NPR were trouncing the usual hyperpartisan winners in the daily engagement wars.
Facebook could adjust its algorithm in this manner all the time. The company chooses not to — in part, the Times reports, because research also suggests that sensationalist content increases time spent on the site. Yet these rankings matter even more than the attention-grabbing determinations about whether to slap a label on a false claim about election fraud from President Trump or whether to ban conspiracy theorist Alex Jones. Facebook and its peers have recently started to take responsibility for how they treat individual people, pages and posts. Taking responsibility for how they treat the flow of information in general is the next frontier.
Facebook could argue that by prioritizing engagement or time spent, it is only giving people more of what they enjoy, allowing them in effect to choose their experience. But that’s a twisted definition of choice when design decisions, whether the endless scroll or a “like” button, manipulate users into having the experience a platform plans. Facebook could also argue that it is, at core, a business — and certainly it is true that the site has the prerogative to serve its own commercial success. Yet these companies must ask themselves whether their only duty is to profit by keeping people’s eyes on their screens, or whether it is also to protect and even improve the societies in which they play so influential a role.
Whatever they answer, the rest of us ought to know about it. These networks have the capability to give us less polarizing versions of themselves, and they also have the capability to tell us whether or how they are doing it. They should do both — so that if Facebook determines it wants to be “good for the world,” we know that’s the path it is taking, and we also know what “good for the world” is supposed to mean.