The Next Big Sensation?
Researchers Are Working to Make Touch Technology Feel Like the Real Thing

By Joel Garreau
Washington Post Staff Writer
Monday, December 15, 2008

Did you know that those few places on your body where you cannot grow hair are by far the most sensitive? Like the bottoms of your feet?

That's why the young woman with the metal probe is scratching away at a rough surface in a Johns Hopkins University lab. Suppose you wanted to know what something thousands of miles away felt like -- as easily as you could see what it looks like by aiming a remote Internet camera. What happens if that smart probe transmits the sensation to all those dense nerve receptors along your tingly arch?

After all, there are some occasions when only touch will do, aren't there?

This has been the year computers began to deliver feelings to us in a mainstream way. Following their uncanny ability first to interact with our eyes via screens and then our ears through speakers, now tens of millions of them are acquiring touch feedback. You touch the machine, it nuzzles you back.

Feel matters. It's the pea under the princess's mattress. "The world is going digital, but people are analog," says Gayle Schaeffer of Immersion Corp., a leader in touch feedback. "We like real things. We touch real things all day long. We need to interact with something that feels real. In the digital world, touch is so much more personal and private and non-intrusive."

Feeeeeeeellllings, woo-o-o,

Feeeeeeeellllings, woo-o-o,

Feeeeeeeellll you again in my arms.

* * *

Computer screens that you can usefully touch are as common as ATMs and airport check-in kiosks. With the explosive popularity of the Apple iPhone, it became clear that soon, everyone was going to have a touch screen in her pocket.

Indeed, the touch-surface juggernaut marches relentlessly toward the day when push buttons that physically move in and out are gone forever. Already being conquered are televisions, washers, ovens, printers and workout machines, says Steve Koenig, director of industry analysis at the Consumer Electronics Association. Touch screens are now invading dashboards, desktop phones, remote controls, music players, navigators and cameras.

Touch surfaces make things much cheaper and more convenient for the manufacturer. Resistance is futile. Although, to be fair, with great design the opportunity for cool exists.

The big problem is that no matter how much you gussy it up, touching a flat computer screen feels like touching a flat computer screen. It can have as many flashing, beeping pictures of buttons as you like, but there's something about the human brain that doesn't trust those little icons. We mash them again and again, our primal lizard ape brains not believing those icons are actually responding to us -- because it feels all wrong.

Now we're trying to solve that. The multibillion-dollar goal is for smart devices to make our fingers feel as if they are actually working with the good old three-dimensional physical objects that evolution has taught us to trust.

That's why competitors to the iPhone are focusing on the main thing it has yet to offer. Advertising directors for the BlackBerry Storm are doing their level best, this holiday season, to make sure you know that their product is not just touchy, but touchy-feely. Hit its screen and you get a hint of a tactile response. This means a lot.

We've been trained to savor the feel of physical objects. Pull the lever of a slot machine and you get a ker-chunk as rewarding as with the lever of a Winchester. Artists obsess over the difference in snap between squirrel-hair brushes and sable-hair brushes. A fisherman knows the feel of his favorite rod as precisely as the golfer knows her putter. No chef would ever put up with a badly balanced knife blade.

Real or Imagined?

Touch can be spoofed. Cold spaghetti and the power of suggestion can make blindfolded people believe they're being covered with worms, and you can convince visitors to a "haunted house" that they're feeling eyeballs when they're actually touching peeled grapes. That's basic to the science and magic of touch, says Allison Okamura, director of the Haptics Laboratory at Johns Hopkins in Baltimore.

"Haptics," as it is called, refers to the ability of people to sense the world around us through touch. Haptics is to touch as optics is to sight. "Haptics technology" refers to our ability to capture and transmit the vast array of information we get from feeling our three-dimensional world, the way cameras and screens feed information to our eyes.

Okamura's operation is part of the Hopkins robotics lab, a handsome space strewn with marvelously engaging objects. Not the least of these are the human skulls -- what's up with those?

Okamura takes command of an experimental surgical robot that could help operate on your eyes. "This eliminates tremor," she says, maneuvering the robot's business end over the eye socket of one of those skulls. "If I shake, it holds me steady. I can force it to make me move very slowly and deliberately, so it makes me extremely accurate. You can go in and puncture vessels in the eye. For macular degeneration. People like to inject declotting drugs and things like that directly into the vein. In the retina."

Pause.

"Everybody has a different thing that freaks them out. Apparently this is yours," she says.

Robot-assisted surgery has been around for some time, but the surgeon usually stares at a screen to see where the scalpel is going. The way to achieve superhuman steady hands, Okamura explains, is by engaging touch. Computer-mediated feedback makes one's hands feel as if they are maneuvering through goo. Or say you want to peel a very thin membrane off the back of the retina without puncturing the retina itself: virtual feedback can guide the surgeon's hands.
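In rough code, those two ideas -- motion that feels damped, and a "virtual wall" the tool cannot pass -- look something like this. The gains, depths and function names are illustrative guesses, a minimal sketch rather than the Hopkins lab's actual controller:

```python
# A minimal sketch (not the Hopkins controller) of viscous damping that makes
# the tool feel as if it moves through goo, plus a virtual wall that pushes
# back before the tip reaches a forbidden depth. All numbers are made up.

def cooperative_control_force(tip_velocity, tip_depth,
                              damping=8.0,          # N*s/mm, resists fast (tremor) motion
                              wall_depth=2.0,       # mm, depth the tip must not pass
                              wall_stiffness=50.0): # N/mm, stiffness of the virtual wall
    """Return the force (N) the robot adds to the surgeon's hand."""
    # Damping term: always opposes motion, so quick tremor strokes feel "thick"
    # while slow, deliberate motion passes almost freely.
    force = -damping * tip_velocity

    # Virtual wall: once the tip crosses the limit, push back like a stiff
    # spring, so the membrane can be peeled without puncturing what is beneath.
    if tip_depth > wall_depth:
        force -= wall_stiffness * (tip_depth - wall_depth)
    return force


print(cooperative_control_force(0.5, 1.0))    # fast tremor, safe depth: -4.0 N
print(cooperative_control_force(0.05, 1.0))   # slow deliberate motion: -0.4 N
print(cooperative_control_force(0.05, 2.1))   # past the wall: about -5.4 N
```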

Aren't surgeons wary of robot assistance invading their turf?

"Not here," says Okamura. "We've got the best in the world. They're real cowboys. They want the newest and best. They come down here and try the stuff out, and no matter what we're working on, they say, 'Can't you make it feel more realistic?'

"I look at them and say, 'That's my life's work.' "

Meanwhile, Kathryn Smith, the undergrad observed messing around with people's feet, wants to know if she can record the feeling of something and convey it to your brain through your skin's sense of touch, the way a microphone and speakers can pick up sound and engage your ears.

When you feel the difference between a sheet of notepaper and a sheet of sandpaper, it's because you're judging which causes your skin to vibrate more. Those skin vibrations are what your nerve endings pick up, causing your brain to read "rough."

Can a prosthetic hand be made to feel? The haptics lab's fingerlike probe can pick up the vibrations caused by rough surfaces, but how do you get that useful information to the brain? Suppose you connect the probe to what amounts to a sophisticated vibrator not unlike the one that drives an audio speaker. Suppose you place that at a sensitive part of your body -- such as your foot. Would your brain be able to use the nerve receptors there to read the roughness signal correctly?
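The record-and-replay idea can be sketched in a few lines. The names and numbers below are assumptions for illustration -- an accelerometer on the probe tip, a voice-coil actuator against the skin -- not the lab's real code:

```python
# Rough sketch: capture surface vibration from a probe-tip accelerometer,
# then scale the same waveform into a bounded drive signal for a voice-coil
# actuator (essentially a speaker driver) pressed against the skin.
import numpy as np

def texture_to_drive_signal(probe_accel, gain=0.6, max_drive=1.0):
    """Turn a recorded vibration trace into a bounded actuator drive signal."""
    probe_accel = np.asarray(probe_accel, dtype=float)

    # Remove the constant offset (gravity, sensor bias); only the vibration
    # itself carries the rough-vs.-smooth information the nerves read.
    vibration = probe_accel - probe_accel.mean()

    # Normalize to the actuator's range, with a gain chosen so sandpaper feels
    # clearly stronger than notepaper without saturating the coil.
    peak = np.abs(vibration).max() or 1.0
    return np.clip(gain * vibration / peak, -max_drive, max_drive)

# A rougher surface produces a larger, more jagged trace, so the replayed
# signal shakes the skin harder -- which the brain reads as "rough".
stand_in_trace = np.random.default_rng(0).normal(0.0, 1.0, 2000)
drive = texture_to_drive_signal(stand_in_trace)
```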

Our touch is also exquisitely sensitive to temperature. How would you make a computer convey that? Put your hand on one of the concrete uprights on the lab's wall. It's cool to the touch. Then put your hand on a metal chase. It feels colder. A wooden desk feels warmer. Actually, they are all the same -- room temperature.

The metal "feels cold because the heat rapidly moves from my hand into the object," says David Grow, one of the haptics lab's grad students. "But it's no colder than the concrete next to it," says Okamura.

How do you teach a computer to tell your brain all that?
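One standard piece of the answer, sketched with rough textbook numbers: when skin touches a material, the temperature at the contact point depends on each side's thermal effusivity, the square root of conductivity times density times heat capacity. Metal's high effusivity drags the contact point toward room temperature; wood's low effusivity leaves it near skin temperature. A thermal display only has to reproduce that contact temperature, not the whole object. The material values below are approximate:

```python
# Contact temperature of two semi-infinite bodies pressed together, using
# thermal effusivity e = sqrt(k * rho * c). Values are rough approximations.

def contact_temperature(t_skin, t_object, e_skin, e_object):
    """Skin-side contact temperature, in degrees C."""
    return (e_skin * t_skin + e_object * t_object) / (e_skin + e_object)

E_SKIN = 1100.0   # approximate effusivity of skin, W*s^0.5 / (m^2*K)
MATERIALS = {"steel": 13000.0, "concrete": 2000.0, "wood": 400.0}

for name, e in MATERIALS.items():
    t = contact_temperature(t_skin=33.0, t_object=21.0, e_skin=E_SKIN, e_object=e)
    print(f"{name:8s} contact temperature ~ {t:.1f} C")

# steel    ~ 21.9 C  (far below skin temperature: reads as "cold")
# concrete ~ 25.3 C
# wood     ~ 29.8 C  (close to skin temperature: reads as "warm")
```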

Okamura offers her cowl-neck sweater as an example of the difficulties. It looks like silk, but feels a little like nylon. Sure enough, the label shows plenty of silk, but also 12 percent nylon. And a little spandex. The brain's ability to process touch is an astonishing thing.

There are far more females in the haptics lab than is typical of mechanical engineering departments. Why is that?

"Part of it," says Okamura, "is that I encourage them. But it may not be too far-fetched to think that females are drawn to the idea of engineering touch. The reason I'm in mechanical engineering is that I like to put stuff together and make it work. But haptics also slops over into fields like physiology and psychology -- it's grounded in the business of figuring out exactly how humans tick. Psychology, as a field, is loaded with women."

Good Vibrations

Pricey niche products were the first to offer touch feedback -- high-end Mercedes-Benzes and BMWs, medical devices, the Wii controller and some casino and bar-top games. But now more than 35 million touch-feedback cellphones have shipped. Most of them far exceed in sophistication the mechanical spring-loaded screen of the BlackBerry. One, from Samsung, can transmit a reasonable semblance of a beating heart. That's why it's time to start envisioning what our world will be like when every smart object routinely interacts with the brain this novel way.

The game world has shown how touch can be integrated with vision and hearing. When you "hit" a tennis ball with a Wii controller, not only are your eyes on the screen, but when you "connect" with the virtual ball, triggering the vibration that fires your touch nerves, the device sounds a resounding thwack. Arguably, it's the sound that really has you thinking you've hit a tennis ball, not a baseball. But, as in life, it's the combination of senses that your brain processes.

"So how do we move from wow and games and the fun part into practical business tools that you can't live without?" asks Chuck Joseph, general manager of the touch interface products group at Immersion. He's been through this sort of thing before, helping transform global positioning from something only the military had into so much a part of our lives that "now kids have it in their shoes."

He remembers getting the attention of the CEO of a multibillion-dollar company by taking a sophisticated surveying tool and making it something you could understand without looking at it. "It has touch-screen, but when the surveyor is walking around looking at that screen and trying to touch it, he's tripping, he's falling, he's got a backpack on, he's got an antenna at the end of the pole." So Joseph's crew transformed it into something like a touch Geiger counter. The closer the target, the stronger the vibration.

Warm, warm, warmer, warmer, hot, hot, hot.
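A toy version of that warmer-colder mapping fits in a few lines. The ranges and curve here are illustrative guesses, not Immersion's actual tuning:

```python
# Map distance-to-target onto vibration strength and pulse rate, so the device
# can be read without looking at it -- a "touch Geiger counter".

def haptic_homing(distance_m, max_range_m=50.0):
    """Return (vibration strength 0..1, pulses per second) for a given distance."""
    # Beyond max range, stay silent; on top of the target, buzz at full strength.
    closeness = max(0.0, min(1.0, 1.0 - distance_m / max_range_m))
    strength = closeness ** 2            # barely-there far away, urgent up close
    pulses_per_second = 1 + 9 * closeness  # 1 Hz far away, 10 Hz at the target
    return strength, pulses_per_second

for d in (60, 40, 10, 1):
    s, p = haptic_homing(d)
    print(f"{d:>3} m away -> strength {s:.2f}, {p:.1f} pulses/s")
# 60 m -> 0.00, 1.0; 40 m -> 0.04, 2.8; 10 m -> 0.64, 8.2; 1 m -> 0.96, 9.8
```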

"Imagine that coming into your friend-finder," Schaeffer says. "Teenagers at the mall. Or you're trying to figure out where you're going and sometimes you can't hear on a busy street corner. So your GPS can have that feeling to turn left or right, or keep coming.

"As a mom, you can have messaging and alerts that feel different. I'll know it's my son, even if I have my sound off. And I'll know what priority it is. If this is an SOS, I would walk out of this meeting to take the call. It could feel like whatever we wanted to make it feel like -- a heartbeat."

A Kiss Is Still a Kiss

Immersion employs people called "haptic artists" who build touch effects. "It's just like composing music or painting a picture. It's the creation of feeling," says Schaeffer.

Can you transmit a kiss?

"We can transmit a slap," she says. "That's one of my favorites, for when you get 'I'm coming home late.' "

What about sex?

Very long pause.

"Well, that's not a current research program, I can just tell you that," Schaeffer says.

May be missing a bet.

Last year an impressive range of serious publications, including this one, respectfully reviewed a book titled "Love and Sex With Robots: The Evolution of Human-Robot Relationships," by David Levy.

"Levy's thesis isn't as silly as you might initially think," The Post's reviewer wrote. "Technological advances will someday be complemented by cultural changes, and cavorting with robots just won't seem weird anymore."

Why is it important to humans that machines are beginning to touch us back?

"It was incredibly important to humans when robots started to look at you, recognize a face and make eye contact," says Sherry Turkle, a psychologist, author and director of the MIT Initiative on Technology and Self.

"The eye contact turned out to be a significant Darwinian button. We are hard-wired for that. That's how we sense the presence of an other. Same thing with touch. That is the way we connect with an other that knows about us, that understands us. It is in our evolution. We are hard-wired to communicate with each other by touch. It's how we stroke babies, how we want to be comforted. . . .

"A heartbeat is a powerful way of signaling the presence of another human soul."

