I saw the movie about a year later, when it received a New York release, a few months after a Harvard dropout named Mark Zuckerberg gave an interview to CNBC about his new business. “Maybe we can make something cool,” Zuckerberg said, describing “The Facebook,” as it was called then, as a way for people to enter information about themselves and “connect with friends.”
CNBC has never been the sort of outlet to ask a CEO and founder if his business model was psychopathic. But it’s certainly a conclusion we can consider now. The New York Times’s newest investigative blockbuster on Facebook revealed that the company shared personal information — including, in some cases, private messages — with a range of corporations including Netflix, Spotify, the Royal Bank of Canada, Pandora and, ahem, the New York Times and Amazon. (Amazon.com founder and chief executive Jeffrey P. Bezos owns The Post.) The sharing occurred even if consumers proactively opted out of allowing outside companies to view their data. The Times even discovered the Russian search engine Yandex had access to some data. It seems quite likely this violated a 2011 consent decree Facebook entered into with the Federal Trade Commission, in which the company agreed it wouldn’t share such user-generated content without users’ explicit agreement. It also seems to contradict Zuckerberg’s statement to Congress earlier this year that Facebook’s users remain in control of how their information is parceled out.
You hardly need me to tell you at this point that Facebook is a serial violator of all sorts of norms, and that if it were a person, its many scandals and the way it has handled them would easily meet the criteria for clinical psychopathy. Our natural desire to connect with friends from our past and simultaneously make new acquaintances made Zuckerberg a multibillionaire, as the company he created in a Harvard dorm room sold the information — sometimes to less-than-savory characters — that almost all of us all too willingly gave it. At the same time, Facebook’s top executives turned all but a blind eye to such things as the use of its platform and WhatsApp messenger app to incite ethnic violence in countries including India, Myanmar and Sri Lanka; attempts by Russia to manipulate the 2016 elections in the United States; and the use of Facebook by white nationalists.
But Facebook and the moral rot exhibited by its executive leadership exist within a broader corporate culture. In the past week, we’ve also discovered that:
- Employees of corporate consultant McKinsey helpfully identified leading dissidents to the Saudi government.
- Executives at pharmaceutical giant Johnson & Johnson knew that its talc powder contained cancer-causing asbestos and refused to do anything about it.
- Changes in coal mining practices sparked a resurgence of black lung among coal miners while government regulators looked the other way.
- Sears received permission from a federal bankruptcy judge to pay high-ranking executives $25 million in bonuses while cutting off thousands of employees without a penny in severance pay.
And that list is likely incomplete.
Over the past several decades, it’s become common for companies to define their best interests in an extreme bottom-line way, with little concern given to their employees, customers or the general public. But when courts declared a corporation a person, they didn’t acknowledge that corporations, not being people, do not possess a moral compass. Over time, this has not infrequently drawn people missing a few moral rungs to aspire to corporate leadership. It’s possible it’s baked into the job description: A 2016 paper found one in five American senior corporate executives could be diagnosed as psychopaths. (In an ironic twist, the paper was subsequently retracted over plagiarism charges. The research itself was not in question.)
Moreover, corporations are frequently protected by both law and the power of their money in ways individuals are not. Over the past two decades, government regulators have been loath to go after corporations in such a way that fines either do substantive damage to their bottom lines or put them out of business altogether — while at the same time they’ve also pulled back from prosecuting top executives for their violations of law and regulation. Watchdogs — both in government and outside of it — are underfunded and outmanned.
It’s a triumph of hope over experience to assume Facebook — or really, any multibillion-dollar corporation — will address its failings on its own. Such companies have been financially rewarded for those failings for way too long. We need government to step up in a much more aggressive way than agencies have in the past. In the case of Facebook, this could mean everything from privacy legislation with muscle — something the United States happens to lack — to declaring the business a public good and regulating it like a utility of old, with demands that it open up its network to outsiders. But there is a bigger picture too, one that applies more broadly: If corporations are people, they need to abide by the standards of decent, civilized behavior or face the consequences.