A Facebook employee walks past a sign at Facebook headquarters in Menlo Park, Calif. (AP Photo/Jeff Chiu, File)

When Facebook introduced its News Feed in 2006, users quite literally revolted against it.

Petitions to nix the feature racked up tens of thousands of signatures. Early users — mostly college students, at that point, since Facebook hadn’t yet opened up to the general public — left the social network in droves. Columnists at every major college paper in the country wrote eulogies to a site that, they assumed, could not outlive the outrage. News Feed was, at one point, dubbed “this generation’s Vietnam.”

“I lost three friends to the great Facebook massacre of ’06,” Carlos Maycotte wrote in the Cornell Daily Sun. “It was a rough day all around.”

In the eight years since, Facebook’s News Feed — that LED-lit window through which we glimpse news, memes and snatches of other people’s lives — has not exactly gotten less controversial. But the nature of that controversy has fundamentally changed. Where early college users raged against sharing, and seeing, too much information — against being subsumed, in effect, by the social media noise — our anxieties today frequently involve getting too little of it. Facebook’s latest changes to the News Feed, announced just last week, are essentially tooled to give users more content, more quickly.

Both concerns relate to control. Whether we see too much content or too little, everything we see in Facebook’s News Feed is determined by an algorithm — a mathematical formula that guesses what we want to see based on who posted it, where it came from, and a string of other factors known only to the programmers and project managers who work on it.
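To make that opaque idea concrete, here is a minimal sketch of what such a ranking formula could look like. Everything in it (the factor names, the weights, the decay curve) is an invented stand-in; Facebook has never fully disclosed its real signals.

```python
import time
from dataclasses import dataclass

# A toy feed-ranking sketch. The signals and weights below are invented
# for illustration, not Facebook's actual formula.

@dataclass
class Post:
    author_affinity: float  # how often the viewer interacts with the poster, 0 to 1
    type_weight: float      # e.g. a photo might count for more than a plain status update
    created_at: float       # Unix timestamp of the post

def score(post: Post, now: float) -> float:
    """Guess how much a viewer 'wants' a post: closer friends,
    weightier content types and fresher posts all score higher."""
    age_hours = max((now - post.created_at) / 3600.0, 0.0)
    time_decay = 1.0 / (1.0 + age_hours)  # older posts fade toward zero
    return post.author_affinity * post.type_weight * time_decay

# Rank a hypothetical two-post feed, highest score first.
now = time.time()
feed = [
    Post(author_affinity=0.9, type_weight=1.5, created_at=now - 7200),  # close friend's photo, 2 hours old
    Post(author_affinity=0.2, type_weight=1.0, created_at=now - 600),   # acquaintance's status, 10 minutes old
]
feed.sort(key=lambda p: score(p, now), reverse=True)
```

What a user scrolls through is just such a list, sorted; anything that falls below some cutoff simply never appears.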

Too much information

That was less a concern when News Feed was introduced to primordial Facebook’s 9 million users in 2006, just two years after the network launched, and well before it opened to non-student users. Our conception of “sharing,” both in the sense of sending content and receiving it, was fundamentally narrower. We were accustomed to putting photos on our Myspace profiles, for instance, or perhaps sharing tidbits from our lives on LiveJournal or Xanga. But in every case, our updates were quiet, siloed — the digital equivalent of speaking softly in a very crowded room. To hear us, friends and curious strangers would have to navigate directly to our corner of the Internet, to our profile, to our albums or blogs.

When News Feed upended that — essentially guaranteeing that every online post was less a conversation, and more a shout from the Internet rooftops — users rioted. They hated the idea that their information, so dutifully and naively logged into Facebook all these months, could suddenly belong to a wider public. And more than that, they hated the idea that they’d soon be bombarded with other people’s trivia.


Users were “overwhelmed,” claimed early reviews. They found the stream feature “creepy.” Newspapers and savvy Web sites published guides on how to hide more content, not less. And Facebook, predictably, trumpeted every change to the News Feed and its underlying recommendation algorithm — the formula that determines what posts, and how many posts, you see — as wins against clutter, against too much noise. When Facebook made the News Feed real-time in 2009, it promised that even those Twitter-like “live” updates would be well-filtered.

“That’s the big secret about controlling the waterfall of content that social networks mercilessly dump on your head every second: You’re not supposed to consume every drop,” Matt Buchanan wrote in The New York Times as late as 2012. “Facebook designed its News Feed, which is the first thing you see when you go to Facebook.com, to deal with this problem.”

“It put the social back in social network,” he concluded.

The trouble with algorithms

But even as Buchanan wrote those words, there was plenty of indication that the News Feed algorithm might not be quite as pro-social as he and many others implied. The year before, Facebook had revamped News Feed to surface content that was “interesting,” not necessarily relevant or new. And earlier that year, Facebook had debuted a roundly criticized feature called “personal promoted posts,” which — like posts promoted by brands and fan pages — let users outsmart the Facebook algorithm … for a price. Seven dollars guaranteed that more people would see your latest job announcement or baby photos. The feature implicitly recognized that Facebook’s own algorithm hid content people wanted to see.

“What really stings is that Facebook is now abusing the myopia it created,” Mashable’s Matt Silverman wrote in 2012. “Essentially, the network is ‘hiding’ your updates from friends, and then turning around to say, ‘Hey, if you want friends to see your updates, you could pay us!’”

Only weeks later, a programmer dug up a secret Facebook URL that let users see their unfiltered feed, sans algorithms. When Facebook struck it down — which it did quickly — users circulated an online petition to get rid of Facebook’s ranking algorithm, called EdgeRank, entirely. (Facebook, unsurprisingly, refused.)
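EdgeRank, at least as Facebook engineers sketched it publicly in 2010, was simple enough to write on a napkin, though by that point the production system had reportedly grown far more complicated. In rough form, a story’s rank summed three factors over each of its interactions, or “edges”:

\[
\text{EdgeRank} \;=\; \sum_{\text{edges } e} u_e \, w_e \, d_e
\]

where \(u_e\) is the affinity between the viewer and the edge’s creator, \(w_e\) is the weight of the edge type (a comment counting for more than a like, say) and \(d_e\) is a decay factor that shrinks as the edge ages.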


But in the two years since, disgruntled users have only grown more wary of EdgeRank’s quiet influence. Writing in March 2013 — in the same blog where Buchanan had trumpeted News Feed — The New York Times’s Nick Bilton provoked a minor PR scandal for Facebook when he claimed his posts mysteriously reached almost no one anymore, a sign, he suggested, of algorithmic tampering. Earlier this year, when Facebook revealed its research team had manipulated the algorithm to show more or less “happy” content in people’s feeds, the Internet had an all-out meltdown — accusing Facebook, in some cases, of inflicting psychological harm on unknowing research subjects.

And months later, when gripping protests broke out in Ferguson, Mo., over the police shooting of an unarmed teenager, critics demanded to know why the subject took so long to trend on Facebook, the site where one-third of Americans reportedly get their news.

“This isn’t about Facebook per se—maybe it will do a good job, maybe not—but the fact that algorithmic filtering, as a layer, controls what you see on the Internet,” wrote the sociologist Zeynep Tufekci. “… Algorithms have consequences.”

What changed since 2006? Certainly not the natural animosity with which people greet each and every change to a familiar Web site — that will always come standard. But I suspect the shift in tone around News Feed is about more than just that knee-jerk Internet outrage. After all, we supply so much more information now than we did even one or two years ago: according to “Zuckerberg’s Law” — a theory named for, and articulated by, the Facebook founder himself — people share twice as much every year as they did the year before. There are also more people on Facebook: 1.32 billion active users today, compared to roughly 10 million when News Feed launched. It’s not just more ordinary users, either: There are more celebrities. More news organizations. More brands. So when we miss something now, we’re talking about loss on a much greater scale: hundreds or thousands of posts, as opposed to dozens, and with far more interesting or actionable content than, say, the results of the last BuzzFeed quiz your cubicle-mate took.
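Taken at face value, Zuckerberg’s Law is simple exponential growth. If \(S_0\) is the amount a person shared in some base year, sharing after \(t\) years of doubling is

\[
S_t = S_0 \cdot 2^{t},
\]

so the eight years between News Feed’s launch and today would compound to \(2^8 = 256\) times as much sharing per person, before even counting the roughly hundredfold growth in users.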

In 2012, Buchanan wrote, Internet users celebrated the “joy of missing out.” As a million T-shirts and hashtags now attest, our attitude on that score has fundamentally reversed. It surely helps that, in the past two years, we’ve become all too aware of the insidious technological forces that watch and shape us: the NSA, data brokers, and the dreaded “filter bubble” that Eli Pariser first identified in 2011, a term that has since come to describe a slate of fears about literacy, bias and the unseen algorithms that influence both.

“Everybody’s doing this,” Christian Rudder, the data scientist behind OkCupid — and an outspoken defender of Facebook’s algorithmic tweaks — stressed at a lecture in D.C. Monday night. When you Google something, an algorithm surfaces some results and buries others. When you look for a date, OkCupid necessarily hides thousands of people. You will never meet them; they may as well not exist.

And while that type of filtering may very well be necessary — Facebook would be unusable without it, a product manager told Mashable in 2012 — the idea that we do not choose it, that we do not control it, still unsettles.

“If a Facebook user posts and the algorithm decides that no one should hear it,” wrote The Atlantic’s Alexis Madrigal in 2010, “did he really write something?”

Four years later, News Feed and its algorithms have changed a hundred times. But that question, at least, still haunts us.
