Facebook, in its vast, overlordly benevolence, wants its 1.32 billion serfs to believe that it is ridding News Feed of “click-bait” on their behalf.
We will protect you from unscrupulous, overhyped headlines, they assure us. We will surface only “valuable” information, the kind you want to e-mail to your mother or tweet on the #longreads tag.
But the truth is a little more nuanced, as truth so often is. Facebook is simply confronting a classic Internet-recommendation puzzle, one faced by everyone from Netflix to OkCupid: How do you give people what they actually want, versus what they say they want? The two are rarely the same, but the company that can see past the latter to provide the former boasts a huge advantage over its competitors.
There’s a secondary question here, as well: What DO people actually want — the #longreads, or the videos that “give you all the feels”?
Interestingly, there’s a good analog for this conundrum in the hot and heavy field of online dating. Early dating sites, including behemoths like Match, paired users according to questionnaire results — i.e., what people said they wanted. A responsible nonsmoker, maybe, with a professional job. Or a nice boy/girl-next-door who could be brought home to Mom.
But as Dan Slater chronicles in his lengthy history of the industry, when many of these matches met, they didn’t really … match. Some attributed the disconnect to “chemistry,” or the Internet’s inability to account for it. But others, like the prophetic co-founders behind free-dating upstart OkCupid, had another theory: Daters did not actually know, or at least couldn’t articulate, what kind of people they were really interested in.
Hence, instead of merely asking users whom they were looking for, OkCupid began monitoring what people actually did on the site: what profiles they clicked, which messages they responded to, how many messages they exchanged. The site developed an entire wing dedicated to parsing this data. And based on that data, OkCupid served up recommendations that did not necessarily match the user’s stated preferences, but (theoretically, at least) did match his demonstrated preferences.
Get it? With this latest change to its News Feed algorithm, Facebook is essentially pulling an OkCupid. It has always tried to give you what you want, based on your site behavior. Now, it posits, there may be a better way to measure that: Maybe the time you spend on an article is a more accurate signal of your demonstrated preferences than a “like” or a click is.
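To make the distinction concrete, here is a toy sketch of the idea (the story names, fields and numbers are entirely hypothetical, and this is not Facebook's actual system): ranking stories by average reading time per impression instead of by raw clicks. A headline that gets many clicks but is abandoned in seconds sinks; a story that fewer people open but actually read rises.

```python
# Toy sketch, NOT Facebook's real algorithm: all names and numbers
# below are invented for illustration.

def rank_stories(stories):
    """Order stories by average dwell time per impression,
    rather than by raw click count."""
    def dwell_score(story):
        # Seconds of reading per impression: a clicked-then-abandoned
        # story scores low even if its click count is high.
        return story["total_read_seconds"] / max(story["impressions"], 1)
    return sorted(stories, key=dwell_score, reverse=True)

stories = [
    {"title": "27 shocking facts", "impressions": 1000,
     "clicks": 400, "total_read_seconds": 4000},    # many clicks, ~4 s each
    {"title": "Ferguson longread", "impressions": 1000,
     "clicks": 50, "total_read_seconds": 30000},    # few clicks, long reads
]

print([s["title"] for s in rank_stories(stories)])
# The longread outranks the click-bait despite drawing far fewer clicks.
```

Under a pure click metric the order would reverse, which is exactly the trade-off the article describes.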
We don’t know, frankly, if that’s true. (A great number of media reformers and pundits have certainly argued for engagement as a better online success metric.) What’s more important, and more problematic, is that we don’t know if our actual, demonstrated preferences are for or against click-bait, that much-maligned genre that taunts readers with unclear, histrionic and overhyped headlines — following them, in many cases, with equally fluffy “articles.”
Consider the news topics people say they want to read: national, local, the economy. Then consider the most popular Web articles in the past year: “27 shocking and unexpected facts you learn in your twenties.” “24 photos you need to really look at to understand.”
The Atlantic’s Derek Thompson puts it this way: “Ask audiences what they want, and they’ll tell you vegetables. Watch them quietly, and they’ll mostly eat candy.”
Admittedly, these comparisons measure article popularity by clicks, the very metric Facebook says it’s trying to get away from. So perhaps, as Facebook professes to hope, people will click into hefty, longform pieces on stuff like protests in Ferguson, Mo., or the national economy, and they’ll read them all the way through, and those links will proliferate in News Feed.
Alternately, people will click into said longform piece, skim it, X out after 10 seconds, and go on to spend 10 minutes lol-ing over cats that have had ENOUGH, thank you. That is not what people say they’ll do, of course. (Facebook ironically invokes a recent survey, in which 80 percent of respondents said they wanted less click-bait.) But it is, quite possibly, how their actual behavior will play out.
Of course, on some level, this is all irrelevant. We won’t know how people are acting, and we certainly won’t know how, or how much, it alters what we see. In fact, that’s arguably the most important takeaway from every and any change to Facebook’s News Feed: No matter how small or valuable or beneficent it seems, the algorithm is ultimately inscrutable. It is a black box. Users have never known how or to what degree it manipulates their information diet, and they will continue not to know — missing tens of thousands of posts, click-bait and otherwise, for reasons that Facebook never fully explains.
Facebook may be a benevolent overlord, sure, and this change could — potentially! — improve News Feed. But when all is said and done, it is still an overlord. And that may make Facebook a more dangerous villain than even the most shameless purveyors of click-bait.