And that is one big reason Ben Grosser, an artist and professor at the University of Illinois at Urbana-Champaign who describes his work as “writing software in order to investigate the social effects of software,” made a browser extension that randomizes what you tell Facebook about how you’re feeling. It’s called Go Rando, and — I’ll be honest — it made Facebook likes a little terrifying for me when I first installed it.
Here’s how it works: With Go Rando running, each time you click like on a post from your news feed, the extension intercepts that like and randomly selects one of Facebook’s six reactions. That reaction is what shows up on the post. Click like, and you might end up loving the post. Click like on another post, and it might show up as sad.
With Go Rando running, I clicked like on a post from a friend who was passing along a list of job openings at his workplace. My reaction showed up as sad. I was “angry” about a romantic Valentine’s Day post from one of my closest friends. Thankfully, as with any reaction on Facebook, a Go Rando reaction can be undone. And if you want to pick your own reaction for, say, a more sensitive post from a friend, the extension will let you do that, just as you would choose a reaction on Facebook without the extension installed.
But Go Rando works closest to Grosser’s intentions when you interfere with it the least. The idea is, essentially, that all those random emotions confuse the data you give Facebook. By creating random reactions to the posts you like, you stop telling Facebook how you genuinely feel. With enough use, you’ll appear more or less “balanced” to the site, Grosser said in an interview.
“It disrupts the usefulness of this emotional reaction data that Facebook is collecting,” he said. “It disconnects how we feel from the data that’s getting recorded. No longer will my emotions line up neatly with easy analytic pictures of my personality. Over time, as you use Go Rando, instead of creating a picture of ‘Oh, he likes those things. Oh, he hates those things,’ instead it’s going to look like I’m neutral.”
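The balancing effect Grosser describes is easy to see in a quick simulation. This is a hypothetical sketch, not Grosser’s actual code: if every click is replaced by a uniformly random pick from Facebook’s six reactions, the tallies even out over time, and no single emotion dominates the profile.

```python
import random
from collections import Counter

# Facebook's six reactions at the time Go Rando was released.
REACTIONS = ["like", "love", "haha", "wow", "sad", "angry"]

def go_rando_pick(rng: random.Random) -> str:
    """Hypothetical sketch of Go Rando's core idea: ignore what the user
    clicked and substitute a uniformly random reaction."""
    return rng.choice(REACTIONS)

# Simulate 6,000 "likes" passing through the randomizer.
rng = random.Random(42)
counts = Counter(go_rando_pick(rng) for _ in range(6000))

# Each reaction lands close to the expected 1,000 uses,
# so the resulting emotional profile looks "balanced."
for reaction, n in sorted(counts.items()):
    print(f"{reaction}: {n}")
```

Over enough clicks, the distribution converges toward one-sixth per reaction, which is exactly the “neutral” analytic picture Grosser describes.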
Grosser uses the term “emotional surveillance” to talk about what Go Rando is supposed to confound. “Emotional surveillance means using our activity online to ascertain how we feel and to connect how we feel with our interests, our hopes, our fears, in order to more effectively analyze our personalities,” he said. And that sort of personality analysis is useful “for the purposes of message targeting, predictive analytics” and a bunch of other things.
As of last year, Facebook said it did not use reactions data to rank posts in its news feed, but it has also made clear that it intends to do so at some point. Facebook said it had no updates on its use of reactions data in the news feed when we reached out this week.
We already know that Facebook is interested in how you feel – and that likes in general are a driving force behind the algorithms that determine what you see in your news feed. The very purpose of the like button, as Slate’s Will Oremus reported last year, is “to enlist its users in solving the problem of how best to filter their own news feeds,” without even thinking about it.
For Grosser, that’s a phenomenon worth examining. “How we feel, or how we report how we feel on Facebook, has all kinds of implications for what we see in the future on Facebook,” Grosser told me. By randomizing what you report to the site, you’re essentially telling Facebook that you feel each available emotion about the same number of times, and not necessarily for the same sorts of posts. One day, for instance, you might love a post containing a link to a news story that is critical of President Trump. The next day, you might be angry at a similar thing.
Facebook’s previous interest in emotional manipulation is one reason Grosser decided to create a project examining these reactions. In 2014, Facebook revealed that it had run a highly controversial study on a subset of its users: It manipulated what they saw in their news feeds, showing some users more positive posts than normal and showing others more negative posts. Then Facebook used keywords from those users’ status updates to determine their overall mood. (The study, in case you were interested, found that users who saw more positive posts wrote more positive status updates, and that those who saw more negative posts wrote more negative ones.)
Grosser’s work is generally based on a simple concept: adding to, or removing, a normal part of an online experience to make you think about why it’s there in the first place. He made another Facebook-based extension, for instance, that removes all the numbers from the platform. That extension, the Facebook Demetricator, became the subject of a research paper by Grosser highlighting the worrying role quantification plays in feeding our addiction to social media. Go Rando is in the same vein: The app contains an element of chaotic fun (and a bit of mild social danger), but it’s also designed to make you think about how you react on Facebook in the first place, and who benefits when you do. “Liking has become a compulsive behavior,” Grosser said. “Go Rando makes you pause.”
The things you tell Facebook tell the company a lot about you. They also tell any consumer of big data – advertisers, political campaigns, anyone who might want to target a particular demographic with a message or product – something about you. And as a reward for all that useful knowledge, Facebook gives the user a feed that’s designed to display things that will keep them sharing, liking and loving. That feed can easily become a bubble that shows you the world as you’d like to see it — as opposed to the world as it is.
Go Rando still tells Facebook that you reacted to something, random emotions or not. But my short experience using it definitely made me pause each time and think about why.