“I think in the back, deep, deep recesses of our minds, we kind of knew something bad could happen,” said Palihapitiya, 41. “But I think the way we defined it was not like this.”
That changed as Facebook’s popularity exploded, he said. To date, the social network has more than 2 billion monthly users around the world and continues to grow.
But the ability to connect and share information so quickly — as well as the instant gratification people give and receive over their posts — has resulted in some negative consequences, according to Palihapitiya.
“It literally is a point now where I think we have created tools that are ripping apart the social fabric of how society works. That is truly where we are,” he said. “The short-term, dopamine-driven feedback loops that we have created are destroying how society works: no civil discourse, no cooperation, misinformation, mistruth. And it’s not an American problem. This is not about Russian ads. This is a global problem.”
Facebook has pushed back on the former executive’s comments, saying in a statement Tuesday that Palihapitiya has not worked there for more than six years and that it was “a very different company back then.”
Palihapitiya, a venture capitalist and part owner of the Golden State Warriors, made his remarks at a talk for Stanford Graduate School of Business students in November. Video of the talk was widely shared again this week after the Verge reported on his comments Monday.
Though he didn’t have immediate answers on how to permanently correct the problem, Palihapitiya encouraged students to take a “hard break from some of these tools and the things that you rely on.” He added that he has posted on Facebook only a handful of times over the past several years and doesn’t allow his children to use “this sh-t” either, referring to social media platforms.
“Everybody else has to soul-search a little bit more about what you’re willing to do,” he said. “Because your behaviors, you don’t realize it, but you are being programmed. It was unintentional, but now you gotta decide how much you’re willing to give up, how much of your intellectual independence.”
The problem is not isolated to Facebook, he said, citing other social media platforms such as Instagram, Twitter and Snapchat. Palihapitiya pointed to a hoax in India that had spread through WhatsApp and led to the lynching of several men who were falsely accused of being child traffickers.
“Bad actors can now manipulate large swaths of people to do anything you want,” he told the audience. “And we compound the problem. We curate our lives around this perceived sense of perfection, because we get rewarded in these short-term signals — hearts, likes, thumbs up — and we conflate that with value and we conflate it with truth. And instead, what it is is fake, brittle popularity that’s short-term and leaves you even more, admit it, vacant and empty before you did it. . . . Think about that, compounded by 2 billion people.”
After leaving Facebook, Palihapitiya went on to found Social Capital, a venture capital firm that invests in education and health-care businesses often neglected by Silicon Valley. In his wide-ranging Stanford talk, he also addressed using money as an instrument of social change. While he noted that Facebook “overwhelmingly does good in the world,” Palihapitiya also said one of the ways he has reconciled his guilt over growing the platform has been to invest money in diabetes, education and climate-change research.
As the Verge reported, Palihapitiya joined a chorus of former Facebook investors and employees now expressing regret over their contributions to the company:
In November, early investor Sean Parker said he has become a “conscientious objector” to social media, and that Facebook and others had succeeded by “exploiting a vulnerability in human psychology.” A former product manager at the company, Antonio Garcia-Martinez, has said Facebook lies about its ability to influence individuals based on the data it collects on them, and wrote a book, Chaos Monkeys, about his work at the firm.
Most recently, the company was accused of trying to exploit children and erode their privacy after it launched an app last week called Messenger Kids. Facebook has said that it will not display ads on Messenger Kids or use its data for advertising on Facebook.
Facebook has also been criticized heavily for how it regulates — or doesn’t regulate — the content and origin of ads on its platform, especially when it came to thousands of Russian ads that were created to influence voters in the 2016 U.S. presidential election. After some initial resistance, the company turned over thousands of Russian ads to Congress this fall.
Facebook founder and chief executive Mark Zuckerberg has mostly played down the company’s responsibility to monitor and curate its content, saying it is not a media company. Notably, though, at the end of Yom Kippur this year, Zuckerberg posted an apology on his Facebook account “for the ways my work was used to divide people rather than bring us together” and vowed to do better.
In a statement to The Washington Post, a Facebook spokesman said the company is willing to reduce its profits to “make sure the right investments are made.”
“When Chamath was at Facebook we were focused on building new social media experiences and growing Facebook around the world,” the statement read. “ . . . as we have grown, we have realized how our responsibilities have grown too. We take our role very seriously and we are working hard to improve. We’ve done a lot of work and research with outside experts and academics to understand the effects of our service on well-being, and we’re using it to inform our product development.”