There are lots of ways to think about Tuesday’s surprise European court decision, which ruled that search engines must remove unflattering Google results on an individual’s request. It’s a thorny legal issue, with plenty of ripples for freedom of information and the press. It’s a business and logistical headache for search engines like Google, which now have to figure out how to carry out the court’s ruling. Most intriguingly for our purposes, it’s a conceptual puzzle, of sorts: Europe’s Court of Justice is calling this “the right to be forgotten,” without saying what — or who — is actually doing the forgetting.

And that passive construction is really interesting. Presumably, when the court refers to memory in this way, it’s talking about the search engines themselves: They have to take the results down — they, in effect, have to “forget.”

But that frame conveniently skirts two very important realities about how memory and the Internet work. For starters, as Google itself has argued, even if the search engine fails to display a web page, that page still exists in the ether in some form or another and is readily accessible by other means. (Which means it’s not quite forgotten, right?)

More problematically, the ruling underestimates, or perhaps misinterprets, the role that Google and its ilk play in our collective consciousness. If Google “forgets” something, it’s not merely Google that forgets — it’s millions of people across a distributed network, all of them engaged in something called “transactive memory.” Sound familiar? Studies of this phenomenon have converged on one striking conclusion: Technology and the Internet have fundamentally changed the nature and function of memory. Where people once outsourced memory tasks to other people, or to notebooks and other analog aids, they now rely on Google to remember for them. Unless Google “forgets.” In which case, we all forget, too.

That sort of collective amnesia was already quite possible, thanks to the algorithms and internal policies that make Google’s inner workings more or less impenetrable to the outside observer. As Clive Thompson explained in his 2013 book “Smarter Than You Think,” that opacity has frightening implications for media literacy — among other things:

If there’s a big danger in using machines for transactive memory, it’s not about making us stupider or less memorious. It’s in the inscrutability of their mechanics. Transactive memory works best when you have a sense of how your partners’ minds work—where they’re strong, where they’re weak, where their biases lie. I can judge that for people close to me. But it’s harder with digital tools, particularly search engines.

In fact, that’s the great irony of the court’s decision: By ruling that Google must alter its “memories” on behalf of some people, the court effectively made the search engine less scrutable and less transparent for everyone else. Because an individual has the “right to be forgotten,” the rest of us have a legal obligation to forget.

Maybe that’s good for privacy policy, and maybe that’s good for individual rights. But collectively? We’re lobotomized. We don’t even know what we forgot.