Years ago, when Facebook was barely out of the dorm room, most Americans believed world-broadening innovations such as social media were inherently good. But openness ushered in ugliness along with the democratization that platforms’ founders had promised. Misinformation ran rampant. Consumers paid for these public and open services with troves of their data. The solution, until now, for chief executives still committed to early-days idealism was to build structures shoring up their services against myriad threats. Mr. Zuckerberg is presenting an alternative: give up and try something new.
Exactly how dramatic Facebook’s shift will be remains to be seen. The idea is to start with end-to-end encrypted messaging by merging communications across Facebook, WhatsApp and Instagram, and then to build whatever other services the company can on top of that foundation. What that means for the news feed, and on what timeline, is unclear.
Facebook’s changes may be motivated in part by a desire to avoid European regulators’ attempts at breaking apart its properties. They may be motivated, too, by declining use, especially among young people, and a need to search for alternative revenue streams.
But the platform is also moving away from what makes it Facebook because being Facebook has become too difficult. Privacy advocates demand that Facebook stay away from their information, while critics want the platform to crack down on those who have turned it dangerous. Playing lawmaker, cop and court all at once comes at the cost of political controversy and even human harm. Content moderators, the Verge’s Casey Newton reported last week, are being diagnosed with post-traumatic stress disorder from sifting through beheadings, suicide footage and child pornography.
Facebook is not giving up on its business, but it is giving up on its vision. Whether focusing more on private conversations and less on public displays will be better for society is hard to say. Misinformation thrives on WhatsApp already, in part because it is impossible for a platform to police content it cannot see. Now, if things get bad, Facebook won’t be able to see them either. The ugliness won’t go away, yet Facebook’s eyes will be closed.