Posted at 11:07 AM ET, 12/10/2012

The ‘cool or creepy’ test for innovators


If you think about some of the most exciting areas of innovation today, whether it’s 3D printing, synthetic biology, artificial intelligence, augmented reality or facial recognition, they all have at least one thing in common: they can be seen as either “cool” or “creepy.” Facial recognition, for example, is “cool” when it helps one find a lost child in a crowd, but it’s “creepy” when the government uses it for surveillance. These “cool or creepy” innovations are even more exciting than many “disruptive” innovations of the early Internet era because they have the potential to disrupt not just one industry, but an entire society.

As recently as twenty years ago, innovations were simply expected to be bigger, faster and cheaper. Then, during the early Internet era, innovations needed to be billion-dollar ideas with the ability to create entirely new markets. Now, ideas and innovations need to be so far-reaching that they challenge some of our fundamental beliefs about human existence.

This means that if you consider yourself to be an innovator today, you should be asking yourself, “Is my cool innovation potentially creepy too?” If it’s not creepy, it’s most likely just another incremental innovation, rather than something that will change the world.

So what does it mean to be a “creepy” innovation? Paraphrasing Supreme Court Justice Potter Stewart, you know creepy when you see it. It usually triggers one or more ethical questions (e.g. “Do human beings have the right to create a new life form?”), followed by one or more philosophical questions (e.g. “What does it mean to be human?”). It may also challenge one or more assumptions about our fundamental rights, such as the right to privacy.

The “cool or creepy” test is relevant for innovators today because we’re living in an era when just about anything can be digitized. And, once it has been, remarkable things can happen, thanks to the growing power of computers.

Take 3D printing, for example. Who would have thought that the way we download music and movies would be almost identical to the way we acquire design layouts for objects we can create in the comfort of our own homes? Or how about the next wave of biotech startups that are building on Craig Venter’s efforts to map the human genome? We’re getting closer to transforming the genetic code into a “digital life code,” in which just about anything is possible. We are starting to see remarkable experiments in synthetic biology, where we can build customized organisms from scratch, all by digitally mapping DNA. Then there’s Stewart Brand, who has launched a radical new “de-extinction” project with the goal of bringing an extinct species back to life.

Even the functioning of the human brain can be reduced, ultimately, to a series of 1s and 0s. A computer brain can now beat a human brain at chess and at Jeopardy! The next step is for a computer brain like IBM’s Watson to take on the medical field, transforming medicine into an almost purely data-driven discipline. And, after that, according to leading thinkers on the Singularity, such as Ray Kurzweil, it’s time for the computer brain to somehow merge with the human brain. By that time, we will have moved way, way beyond the “creepy” line as we know it today.

Things are going from “cool” to “creepy” at a rapid pace, and technology, as a result, has progressed to a point where ethics and philosophy are just as important as coding skills for the next generation of innovators.

