The Washington Post

The education of Frances Haugen: How the Facebook whistleblower learned to use data as a weapon from years in tech

Haugen’s calls to change Facebook’s incentives have broken through because of careful planning and enthusiasm for deep research, according to interviews with her and with those who know her

Speaking before a Senate subcommittee on Oct. 5, the former Facebook employee told lawmakers the company knows its products can cause harm. (Video: Reuters, Photo: Matt McClain/Reuters)

Over the summer, Frances Haugen called a friend to talk through the implications of her decision to reach out to a journalist and provide confidential documents she had gathered from inside Facebook.

Haugen, a veteran of Silicon Valley giants including Pinterest, Google and Yelp, told her friend Leslie Fine, an entrepreneur and start-up adviser, that in her two years of working as a product manager at Facebook, she’d observed a troubling pattern of behavior that seemed unique to the company. Haugen was scared, she told Fine, and particularly worried that her revelations would destroy her career and invite blowback against her parents, friends and former co-workers.

Fine, an economist who was trained in game theory, assured Haugen that she was being thorough and analytical in her reasoning; it seemed she’d thought through all of the possible outcomes and was willing to live with them.

By September, Haugen had made the decision not just to leak documents but to go public as a whistleblower. “I just don’t want to agonize over what I didn’t do for the rest of my life. Compared to that, anything else doesn’t seem that bad,” Haugen wrote in a text.

“You just defined bravery,” Fine responded.

Haugen, 37, revealed herself this month as the Facebook whistleblower who shared documents with the Wall Street Journal, the Securities and Exchange Commission and Congress that she says show the company repeatedly made decisions to incentivize profits and growth over its users’ well-being. Haugen testified publicly Tuesday in front of a Senate subcommittee, alleging that the social media giant bears some responsibility for a wide range of societal ills.


Whistleblowing is often a messy business, fraught with murky motives and sullied by aggressive corporate campaigns to discredit those who come forward. Despite her worst fears, Haugen appears so far to be an exception: Her calls to change Facebook have broken through, winning bipartisan support on Capitol Hill, bringing renewed calls for regulation and prompting soul-searching among the world’s social media users.

Facebook has painted her as a low-level employee speaking about subjects on which she lacks direct knowledge. So far, however, Haugen has withstood that challenge through careful planning and deep research, according to interviews with her and those who know her. A believer in the power of data to tell a story, Haugen saw an opportunity to turn Facebook’s biggest weapon — its ability to collect and measure the human experience — against it.

The former Facebook employee alleged that the company has misled the public about internal research showing that some of its products have harmful effects. (Video: Reuters)

Facebook has an internal saying that “data wins arguments,” which Haugen appears to have taken to heart when she left the company with tens of thousands of pages of its internal research and communications.

Facebook declined to comment on Haugen’s history at the company, but spokeswoman Lena Pietsch last week noted that Haugen “worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives — and testified more than six times to not working on the subject matter in question.”

“We don’t agree with her characterization of the many issues she testified about,” Pietsch said.

Haugen has also faced criticism from some conservative media outlets, which have accused her of trying to limit speech online, as well as from antitrust advocates who believe the company is too big. She has largely advocated content-agnostic solutions that would add friction to the spread of misinformation online. She also testified that she doesn’t think Facebook should be broken up, because a breakup wouldn’t address the algorithmic problems with its services and could leave Facebook with fewer advertising dollars to invest in safety around the world.

Haugen also is steeped in the culture of Silicon Valley, familiar with its stagecraft and proficient in its methods, according to more than a half-dozen interviews with her friends, former colleagues and professors. And her role on Facebook’s civic integrity team, working on misinformation and counterespionage, uniquely positioned her to have insight into some of the company’s most significant challenges.

Haugen joins a long list of tech industry whistleblowers who have gone public with the poor practices they’ve uncovered in Silicon Valley — including many from Facebook. But Haugen stands out for her methodical planning and the rigor with which she has built her case. She’s used Facebook’s own internal research and communications to show how the company directed resources away from key safety programs. She drew on the files and her own experiences to paint a picture of a company where a prioritization of growth metrics fueled decision-making.

She has assembled a team of experts to back her, including lawyers from the nonprofit organization Whistleblower Aid and the public affairs firm Bryson Gillette.


As Haugen revealed her identity in a prime-time interview on CBS’s “60 Minutes,” her team launched a glossy website with professional headshots, a biography and links to social media accounts. Her testimony garnered so much attention that she was parodied by actress Heidi Gardner in a “Saturday Night Live” skit over the weekend.

"Saturday Night Live" on Oct. 9 took aim at lawmakers' understanding of social media platforms following following Frances Haugen's Senate testimony. (Video: The Washington Post)

In the days since her Senate testimony, Haugen has appeared on a Yale Law School panel to discuss the Facebook files. Lawmakers, including members of the Senate Homeland Security Committee, are planning more meetings with her. She’s also planning a trip to Europe, where she will testify in front of the British Parliament, her lawyer John Tye said at a recent Washington Post Live event.

Sen. Richard Blumenthal (D-Conn.), who chaired the Senate subcommittee hearing where she testified, called her “credible and compelling.”

“Frances Haugen wants to fix Facebook, not burn it to the ground,” Blumenthal told reporters after the hearing. “She had constructive recommendations for how to improve it, make it more transparent, enable parents to protect their children.”


Chris Messina, a product designer who is known for inventing Twitter hashtags, worked with Haugen at Google, where she was on the Google Plus team, the company’s Facebook competitor. She wanted to make technology better because she loved it, Messina said.

“Things would have to have been really, really bad for her to make the decision to possibly sacrifice her career in Silicon Valley to come out and make these statements,” he said.

Haugen showed a proclivity for speaking out at a young age. At 8, the Iowa City native wrote a letter to her congressman, then-Rep. Jim Leach (R), to express her concerns about a plan to turn a local street into a four-lane road, according to Iowa City Press-Citizen archives. She was worried she wouldn’t be able to walk home from school because she had to cross the street.

After graduating from high school, where she won a statewide engineering competition, Haugen studied electrical and computer engineering at Olin College of Engineering, a small private school in Needham, Mass.

For her senior-year project, Haugen chose independent research topics of “privacy and productivity” and “dynamic social network analysis.” She worked on the project during the 2005-2006 school year, when Facebook, still exclusively available to college students, was only about a year old and the social network Myspace was still dominant.

“It’s interesting to see how much the very topics that she’s talking about today were present in her experience here,” said Mark Somerville, a professor and now provost at Olin College, who supervised Haugen’s project.

After graduation, she was hired at Google and did a brief stint in ads before working on products including Google Plus and Google Books. Messina, her colleague there, said she wanted Google Plus to be better than Facebook — not just a copycat. She envisioned better privacy protections and a more thoughtful way of connecting people, Messina said.

“When it came to building social features for the sake of catching up on Facebook or just being competitive with other social platforms, I think she also resisted that,” Messina said.


Haugen was also “solutions-oriented” in her work, said Natalia Villalobos, who worked with her on Google Plus. Villalobos said Haugen helped her write two patents, both focused on connecting members of a community in a more meaningful way.

“Frances encouraged me to think bigger, to think better for what social could be,” Villalobos said.

Google paid her tuition for an MBA at Harvard Business School, where she graduated in 2011. While there, she was an early technical co-founder of Hinge, a buzzy dating app that later raised venture capital funding. Its first incarnation was a Facebook-based app called “Secret Agent Cupid.”

Haugen’s concerns about the role of Facebook abroad were informed by five weeks she spent in Africa in 2010 and 2011, she told The Post. The trip was a mix of pleasure and schoolwork, and she spent part of it on a trek organized by Harvard Business School. She spent time in Egypt in early January, just before its Arab Spring uprising began. She also visited Ethiopia, where she became concerned about the social network’s role in fomenting violence, she said during her congressional testimony.

After school, she began to experience the worsening effects of an autoimmune disorder, which left her with lasting nerve damage. She was diagnosed with celiac disease, and in 2014, she landed in an intensive care unit with a large, slow-growing blood clot. At her lowest point, she lost feeling beneath her knees and couldn’t stand on her toes. She had to use a wheelchair or a walker for most of that year.

The former Facebook employee Frances Haugen told lawmakers Oct. 5 what steps the company could take to make its products safer. (Video: Reuters)

Her illness led to an experience that showed her up close how someone could be radicalized by social media. While she was ill, she hired an assistant to help her with walking and carrying heavy things. He became incredibly important to her, teaching her about fitness and healthy eating and making sure she swam and walked even when it was difficult. But in 2016 he became radicalized on the Internet, she said at the Yale panel.

“It pushed him to a place where he believed George Soros was running the world economy, and nothing I could do could pull him back from that ledge,” Haugen said.


She joined the local-review company Yelp in 2015 as a product manager, according to her LinkedIn page, and worked on the company’s first computer vision project, training software to identify real-world objects in images. She then went to Pinterest, where she worked on how content was ranked in the home feed.

She said at Tuesday’s hearing that her time at Pinterest working on skin-tone filters informed her concerns about how algorithms can discriminate against minorities, which she called “a major issue for our democracy.”

Pinterest spokeswoman Charlotte Fuller confirmed that the company has invested in what it dubs “inclusive” products, including skin-tone ranges and hair-pattern searches.

In 2019, she joined Facebook to work on civic misinformation. She took the job because she viewed it as an opportunity to make sure others wouldn’t experience the pain she did of losing a friend to online conspiracies, she said at the Yale panel. She also wanted to do her part in the run-up to the 2020 election to prevent a repeat of the foreign interference in the 2016 election, she said.

She went into the company clear-eyed about the problems with social media, which had been extensively covered by the news media in the fallout of the Cambridge Analytica scandal. But she soon learned that the problems were much worse than she realized.

“I thought I knew how bad misinformation was,” she said in a recent Post interview. “Then I learned what it was doing in countries that don’t speak English.”


She said she discovered that problems of misinformation are exacerbated in other countries, and that the version of Facebook in the United States is the “most sanitized.” She said at the Yale panel that she kept learning new information that concerned her, including the scale at which people die in ethnic violence fanned by Facebook’s choices, or the security underinvestment in non-English-speaking countries.

When the pandemic forced a lockdown in California in 2020, Haugen drove across the country, from the Bay Area to Iowa City, to be with her family and worked remotely from there.

An inflection point came, she said, when the company decided to disband the civic integrity team after the 2020 election. Weeks later, rioters attacked the U.S. Capitol after organizing in part on Facebook. During her “60 Minutes” interview, Haugen said she decided that Facebook wasn’t willing to make the investments needed to improve its impact on the world.

Guy Rosen, Facebook’s vice president of integrity, pushed back on Haugen’s claims after the interview aired, saying the civic integrity team had been “integrated” into a larger “Central Integrity team” to apply the work for election issues across the company, including on health-related issues.


Haugen began to realize she would not be able to change Facebook from the inside. Shortly after the civic integrity team was disbanded, she confided in her friend Fine, the start-up adviser, over a glass of wine while sitting by a backyard fire in the Cole Valley area of San Francisco.

“She wasn’t sure she was having the impact that she came there to have,” said Fine, who had first met Haugen at a party the day before Thanksgiving in 2014. “I did not know at the time that that was code for ‘I can’t put out the fire from inside the house,’ which in retrospect is what she was saying.”

Before leaving Facebook, Haugen gathered key materials by capturing photos of documents shared with employees on Facebook Workplace, which the company uses for internal communications.

In September, the Journal began to publish a series of articles based on Haugen’s revelations, touching on issues including children’s safety, Facebook’s role in polarizing people and the company’s weak response to human trafficking. Haugen’s identity at first remained a mystery, bringing even greater media attention to her “60 Minutes” interview and subsequent appearance before Congress.

Haugen says she’s committed to making social media better. She’s calling on lawmakers to force greater transparency at Facebook and to consider legislation that would make the company more responsible for what its algorithms prioritize. She also told The Post she’s advocating soft interventions for problematic posts on Facebook, such as requiring people to click through a link to read content before sharing it. She said product changes like this could be even more effective in addressing the company’s content moderation problems abroad, especially in vulnerable countries where Facebook hasn’t invested enough in safety.

“What we see in Myanmar, what we see in Ethiopia are only the opening chapters of a novel that has an ending that is far scarier than anything we want to read,” Haugen said at the Yale panel. “I believe that we still have time to have social media that brings out the best in humanity. That’s not going to come about unless we help guide Facebook in that direction and change the incentives.”