Throughout the hearing she used the term “engagement-based ranking” to synthesize the complexities of Facebook’s problems into a single, neutral term. The lawmakers tried saying it themselves. “We’ve learned … that Facebook conducts what’s called ‘engagement-based ranking’?” Senator John Thune asked tentatively.
He was correct. Facebook’s success as a business boils down to algorithms that bump the most titillating content to the top of users’ newsfeeds. These formulas are fundamental to keeping users engaged but also contribute to the propagation of conspiracy theories(1) on the site and draw teenage girls toward eating disorders on Instagram. In one powerful moment, Haugen pointed out that, years from now, women would effectively suffer from brittle bones and infertility because of Facebook’s choices.
As a witness, she exuded credibility, refusing to be drawn into personal attacks on Facebook CEO Mark Zuckerberg or thorny issues around free speech, admitting when she didn’t have an answer and using clear language.
All the more devastating for Facebook was how the cool-headed testimony started to sound like an intervention. Facebook had been hiding its problems, Haugen said, “and like people often do when they can hide their problems, they get in over their heads.” Congress needed to step in and say, “We can figure out how to fix these things together.”
Previous Facebook scandals have pulled lawmakers in different directions — squabbling with Zuckerberg, for example, over who’s truly being censored — and ultimately resulted in inaction. But their united support and understanding now mark a turning point.
So, here are four things Congress could do based on Haugen’s guidance:
Order Facebook to stop, or drastically reduce, engagement-based ranking algorithms.
Remove the nicotine that drives people back to Facebook and Instagram. Haugen’s alternative is “chronological ranking with a little bit of spam demotion.”(2) That means going back to what Facebook looked like in the early days, when newsfeeds were simply ordered by time. Algorithms could still be used to remove spam, though what that entails will be up for debate, but time and people — rather than machines — would be the ultimate curators of what people see. This would hit Facebook’s profits hard, and Zuckerberg may have resisted such a move because of his fiduciary obligation to shareholders. That’s why Congress needs to step in.
Order Facebook to spend more on content moderation.
Haugen says Facebook should not be broken up. That would starve safety teams across the empire of resources and the ability to work together; it would simply split one problem into several smaller ones. Instead, she suggests “human-scaled social media.” With Facebook’s AI often falling short in finding harmful content, humans already do much of the work spotting and stopping it. But Facebook keeps that work at arm’s length, outsourcing it to third-party vendors. One fix advocated in a recent study from New York University’s Stern School of Business was to double the number of Facebook’s content moderators to 30,000 and make many of them full-time staff members.
Establish an agency to audit Facebook’s algorithms and features.
Haugen called for a federal regulatory agency that could analyze Facebook’s internal “experiments” with software and share that information with its Oversight Board. The board already has a system in place to advise Facebook but has complained that Facebook is not forthcoming with the data needed to make decisions. Raw internal research — like the kind exposed in Haugen’s document dump — could lend greater weight to its directions (or to orders from a new agency) to make Facebook’s sites healthier. For instance, it could order Facebook to elevate authoritative news sources, as the company did after the November election; add a feature requiring users to click a link before sharing it; or add time-off prompts for the most addicted users.
Mandate regular disclosure for researchers.
Facebook should be required to release data on what’s happening on its site (with the right privacy protections) such as what posts are most shared or what political ads are being clicked on. Only then can academics outside the company analyze its systems and report on their findings.
None of these ideas are particularly new. And that will sting for the civil rights and privacy advocates whose suggestions until now have been met with silence and inaction from politicians. But their work has laid a critical foundation for Haugen’s testimony to finally gain momentum. Lawmakers will want to hear what they think of her ideas, and hopefully when they reach out to such groups for feedback they will hear some consensus.
(Corrects description of Frances Haugen’s former job in the headline readout.)
(1) QAnon has become such a problem that a California father earlier this summer took his two children to Mexico and killed them while under the influence of the movement’s ideas, according to federal authorities quoted by the Associated Press on Monday; the report cited multiple incidents of child kidnappings linked to QAnon.
(2) Facebook experimented with chronological ranking years ago as part of so-called A/B testing of the site, to see what would get people to come back more. But after seeing positive revenue results for engagement-based ranking, the company never looked back. Facebook would hurt its own shares if it rolled back the system, which is why it needs to be compelled to do so.
This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.
Parmy Olson is a Bloomberg Opinion columnist covering technology. She previously reported for the Wall Street Journal and Forbes and is the author of “We Are Anonymous.”
Tae Kim is a Bloomberg Opinion columnist covering technology. He previously covered technology for Barron’s, following an earlier career as an equity analyst.
©2021 Bloomberg L.P.