The Washington Post

Europe’s highest court says people have ‘the right to be forgotten.’ But online, forgetting doesn’t exist.

There are lots of ways to think about Tuesday’s surprise European court decision, which ruled that search engines must remove unflattering Google results on an individual’s request. It’s a thorny legal issue, with plenty of ripples for freedom of information and the press. It’s a business and logistical headache for search engines like Google, which now have to figure out how to carry out the court’s ruling. Most intriguingly for our purposes, it’s a conceptual puzzle, of sorts: Europe’s Court of Justice is calling this “the right to be forgotten,” without saying what — or who — is actually doing the forgetting.

And that passive construction is really interesting. Presumably, when the court refers to memory in this way, it’s talking about the search engines themselves: They have to take the results down — they, in effect, have to “forget.”

But that frame conveniently skirts two very important realities about how memory and the Internet work. For starters, as Google itself has argued, even if the search engine fails to display a web page, that page still exists in the ether in some form or another and is readily accessible by other means. (Which means it’s not quite forgotten, right?)

More problematically, the ruling underestimates, or perhaps misinterprets, the role that Google and its ilk play in our collective consciousness. If Google “forgets” something, it’s not merely Google that forgets — it’s millions of people across a distributed network, all of them engaged in something called “transactive memory.” Sound familiar? A growing body of studies on this subject has pointed to one striking conclusion: Technology and the Internet have fundamentally changed the nature and function of memory. Where people once outsourced memory tasks to other people, notebooks and other analog aids, they now rely on Google to remember for them. Unless Google “forgets.” In which case, we all forget, too.

That sort of collective amnesia was already quite possible, thanks to the algorithms and internal policies that make Google’s inner workings more or less impenetrable to the outside observer. As Clive Thompson explained in his 2013 book “Smarter Than You Think,” that has frightening implications for media literacy — among other things:

If there’s a big danger in using machines for transactive memory, it’s not about making us stupider or less memorious. It’s in the inscrutability of their mechanics. Transactive memory works best when you have a sense of how your partners’ minds work—where they’re strong, where they’re weak, where their biases lie. I can judge that for people close to me. But it’s harder with digital tools, particularly search engines.

In fact, that’s the great irony of the court’s decision: By ruling that Google had to alter its “memories” for some, it essentially ruled that it should become less scrutable and less transparent for others. Because an individual has the “right to be forgotten,” everyone else has a legal obligation to forget.

Maybe that’s good for privacy policy, and maybe that’s good for individual rights. But collectively? We’re lobotomized. We don’t even know what we forgot.


Caitlin Dewey is The Post’s digital culture critic. Follow her on Twitter @caitlindewey or subscribe to her daily newsletter on all things Internet.