But these are just still photographs. The power of a 3D virtual self is that it can be animated. Imagine if LinkedIn didn’t simply put your picture next to a job description of a ballerina but showed you prancing around doing spins and jumps.
It cost almost half a million dollars to build the holographic Tupac that danced onstage last spring. But a Microsoft product called Kinect, heralded by Guinness World Records as the fastest-selling consumer electronic device ever, can build the virtual self for about $100, and in no time at all. During a recent visit to Microsoft Research, I watched a scientist twirl the Kinect sensor around my head, and a minute later, I saw a projection of my virtual doppelganger staring right back at me — visible from any angle, able to perform any action fathomable by an animator, capable of being copied a million times for free and never to degrade so long as electricity exists.
There are obvious downsides to these applications. For example, this technology allows for a visual version of libel: Instead of making up gossip about people, you could create video-quality images of their virtual selves doing unseemly things. In the hands of a disgruntled employee or an ex-lover, the possibilities are daunting.
What does the legal system have to say about all this? The picture is murky at best. On the one hand, existing laws may serve perfectly well as deterrents and remedies; for example, copyright precedents should apply equally to virtual images and real ones. If a hologram’s actions hurt someone’s reputation, presumably current libel and slander laws would offer protection.
On the other hand, the Supreme Court has already carved out exceptions for virtual humans. Perhaps the most notable example involves virtual child pornography. In 2002 the court ruled, 6 to 3, that Congress could not criminalize the creation of computer-generated images of children engaged in sexual acts because there was no harm to an actual child. More recently, in 2011, Justice Samuel Alito concurred with a ruling recognizing that the right to free speech protects violent video games, but he wrote a separate opinion to flag the unique nature of virtual media, arguing that users’ experiences “just might be very different from reading a book, listening to the radio, or watching a movie or a television show.”
If virtual media is different, and if doppelgangers are the future, then we can’t think of them as exceptions. Our doppelgangers are us, and they need protection, too.
To say technology is moving quickly is obviously an understatement. Technology is almost always moving quickly. That’s why it’s past time to confront the implications of a futuristic creation that’s already here.