That our ignorance is their bliss is among the unique signatures of a new economic logic that I call surveillance capitalism.
Capitalism has always evolved by claiming things that exist outside the market and bringing them into the market for sale and purchase. This is how we turned making a living into “labor” and nature into “real estate.” Surveillance capitalism now claims private human experience as free raw material for translation into behavioral predictions that are bought and sold in a new kind of private marketplace. And it takes place almost completely without our knowledge.
Competition among surveillance capitalists has produced startling innovations that challenge the foundations of a democratic society.
This unprecedented economic logic was invented almost by accident during the financial emergency of the dot-com bust, when the founders of Google discovered that the “data exhaust” clogging their servers could be combined with powerful analytic capabilities to produce reliable predictions of user behavior. Originally, the key behavior of interest was whether a user might click on an online ad.
The young company’s ability to turn these surplus data into reliable click-through predictions became the basis for a lucrative sales process known as ad targeting. With targeting, advertisers could buy Google predictions on what we would do now, soon and later. The idea was deployed beginning in 2001 in the strictest secrecy. Only when Google went public in 2004 did the world hear about it — and learn that partly on the strength of these operations, Google’s revenue had increased by 3,590 percent.
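The mechanism behind such click-through prediction can be sketched in miniature: behavioral signals are weighted and squashed into a probability, and ads are then ranked or priced by that score. The sketch below is illustrative only — the feature names, weights, and logistic form are invented assumptions, not Google's actual system.

```python
import math

def predict_click_probability(features, weights, bias):
    """Logistic model: squash a weighted sum of behavioral signals into (0, 1)."""
    z = bias + sum(weights[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# Invented example signals: historical click rate, query relevance, session length.
weights = {"past_ctr": 2.0, "query_match": 1.5, "session_minutes": 0.05}
bias = -3.0

engaged_user = {"past_ctr": 0.4, "query_match": 1.0, "session_minutes": 12.0}
casual_user = {"past_ctr": 0.02, "query_match": 0.2, "session_minutes": 1.0}

p_engaged = predict_click_probability(engaged_user, weights, bias)
p_casual = predict_click_probability(casual_user, weights, bias)
```

The point of the sketch is the asymmetry the essay describes: the more behavioral surplus a firm holds about a user, the sharper its prediction of what that user will do — here, `p_engaged` comes out well above `p_casual`.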
Surveillance capitalism soon spread to Facebook and other Silicon Valley companies. But it is no longer the exclusive domain of the tech sector. It has now invaded every economic domain — automobiles, insurance, entertainment, finance, retail, health, real estate and more: anywhere a product or service begins with the word “smart” or “personalized.” The reality is that we are shifting into a new surveillance-based economic order in which our private experience becomes the free raw material for markets that trade in predictions of our actions.
Surveillance capitalists soon discovered that they could use these data not only to know our behavior but also to shape it. This became an economic imperative. It was no longer enough to automate information flows about us; the goal became to automate us. As one data scientist explained it to me, “We can engineer the context around a particular behavior and force change that way. . . . We are learning how to write the music, and then we let the music make them dance.”
It works like this: Ads press teenagers on Friday nights to buy pimple cream, triggered by predictive analyses that show their social anxieties peaking as the weekend approaches. “Pokémon Go” players are herded to nearby bars, fast-food joints and shops that pay to play in its prediction markets, where “footfall” is the real-life equivalent of online clicks.
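The pattern described above — an intervention fired the moment a predicted signal peaks — can be reduced to a simple trigger rule. The threshold, signal name, and timing condition below are hypothetical stand-ins for illustration; no real ad platform's interface is shown.

```python
# Hypothetical sketch of a prediction-triggered intervention: fire an ad
# only when a predicted behavioral score crosses a cutoff at the chosen moment.
ANXIETY_THRESHOLD = 0.7  # invented cutoff on a 0-1 predicted anxiety score

def should_trigger_ad(predicted_anxiety, weekend_approaching):
    """Trigger when the predicted signal peaks as the weekend approaches."""
    return weekend_approaching and predicted_anxiety >= ANXIETY_THRESHOLD

fired = should_trigger_ad(predicted_anxiety=0.82, weekend_approaching=True)
held = should_trigger_ad(predicted_anxiety=0.40, weekend_approaching=True)
```

Trivial as the rule looks, it captures what the essay calls engineering the context around a behavior: the system waits for the predicted vulnerability and acts exactly then.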
This digitally informed behavior modification is carefully designed to bypass our awareness. It robs us of autonomy, of the freedom to choose our actions and of the right to say “no,” eroding democracy from within. And it’s a one-way mirror: These firms know everything about us, while their operations are unknowable to us. Their predictions are about us but not for us.
Apple chief executive Tim Cook made this point recently while calling for a more comprehensive approach to privacy legislation: “Right now, all of these secondary markets for your information exist in a shadow economy that’s largely unchecked — out of sight of consumers, regulators and lawmakers.”
Democracy has slept while surveillance capitalism has flourished. Elected officials determined to rein in the digital titans must understand that surveillance capitalism is bigger than any single company. Regulation will require a new framework that strengthens our understanding of privacy rights. We will need to interrupt and in some cases outlaw (1) the unilateral claim to private human experience as a free source of raw material and its translation into data; (2) the exclusive concentrations of knowledge illegitimately gleaned from our behavior; (3) the manufacture of computational prediction products based on the secret capture of our experience; and (4) the sale of behavioral prediction products.
Our democracy has successfully confronted many excesses of unchecked capitalism, outlawing child labor, uninspected food and unfair wages. Today we face a similar challenge in curbing the excesses of a rogue surveillance capitalism. It is not the work of a day or a year, but it is necessary work, and we must be up to the task, because the alternative promises dangerous consequences for human freedom and democracy.