Cognitive psychologist Mary Czerwinski and her boyfriend were having a vigorous argument as they drove to Vancouver, B.C., from Seattle, where she works at Microsoft Research. She can’t remember the subject, but she does recall that suddenly, his phone went off, and he read out the text message: “Your friend Mary isn’t feeling well. You might want to give her a call.”
At the time, Czerwinski was wearing on her wrist a wireless device intended to monitor her emotional ups and downs. Similar to the technology used in lie detector tests, it interprets signals such as heart rate and electrical changes in the skin. The argument may have been trivial, but Czerwinski’s internal response was not. That prompted the device to send a distress message to her cellphone, which broadcast it to a network of her friends. Including the one with whom she was arguing, right beside her.
Ain’t technology grand?
Czerwinski is working in affective computing, a field that emerged in the mid-1990s from the laboratory of Rosalind Picard at the Massachusetts Institute of Technology. Picard and her colleagues dreamed of creating caring robots. As a first step, they decided to make machines that could detect and help us cope with our sometimes hidden emotions.
One of Picard’s early projects involved helping autistic children. Because her devices were often better than the children themselves at communicating their feelings, she designed ways of feeding information from a wrist sensor to the cellphones of parents and other caretakers so they could know about the stress their children were under and respond accordingly.
“The sensors have provided data that has allowed people to see things that they were guessing about, and often guessing wrong,” Picard says. For example, an autistic child might be lying on the floor looking lethargic, but the signal from his wrist sensor shows that he’s very tense. He’s probably lying on the floor to ease the tension. “A teacher might not understand this and say, ‘Get off the floor and back to your desk.’ The signal allows the child to be better understood.”
Czerwinski is taking the technology in other directions. One is to create ways for people with less dramatic mental conditions to monitor stress in everyday life. The projects she has developed with her colleagues, including Microsoft senior research designer Asta Roseway, include a butterfly-shaped set of wires attached to a sensor wristband. The “Mood Wings” beat at different rates depending on the wearer’s stress. The developers also made a jacket that resembles a chain-mail vest whose bendable, wired “leaves” use 40 motors to flap when the wearer is happy. To show stress, similar motors on the back of the vest literally raise its hackles. Another of their inventions is a fabric device that hangs on the wall; it turns crimson and sets off a fan when it picks up stress from a user’s wrist sensor.
While these gadgets are prototypes that probably won’t go beyond the novelty stage, they represent the kind of machines that, by helping people become more tuned in to their emotions, could allow them to be more self-aware and develop strategies to improve their lives, Czerwinski says.
“Some people say, ‘Why would I want you to know what I’m feeling?’ ” she says. “But there are audiences that could benefit because they have communication gaps. Autistic kids could benefit from fabrics that show feelings. Or in PTSD — could we warn family members that there’s going to be an eruption? Something as fanciful as these devices could expose that.”
Czerwinski and her colleagues have developed a way to help parents of children with attention-deficit hyperactivity disorder respond constructively to their children’s difficulties.
A parent wears a wrist sensor while caring for the child. Via a wireless computer system, the sensor relays signals of parental stress to a network. In reply, the network sends text messages to the parent’s cellphone suggesting helpful behaviors. These are “standard things like ‘Maybe you should take a walk,’ ‘Try some deep breaths,’ ‘Try not to shout’ — the kind you find in parenting books,” Czerwinski says. “But most parents aren’t looking in those books when they reach the end of their tether. This can provide help at the instant it’s required.”
Although one can easily imagine a frustrated parent slamming her intrusive cellphone against the wall, Czerwinski says that, when tested, the system helped parents use the right approach with their kids.
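The loop Czerwinski describes — smooth the sensor signal, and when stress crosses a line, text a coping tip — can be sketched in a few lines. This is a minimal illustration, not Microsoft’s implementation: the 0-to-1 stress scale, the threshold, and the function names are all invented for the example; only the suggestion texts come from the article.

```python
import random  # stand-in for however a real system would pick a tip

# Suggestions of the kind the article quotes
SUGGESTIONS = [
    "Maybe you should take a walk.",
    "Try some deep breaths.",
    "Try not to shout.",
]

def rolling_average(readings, window=5):
    """Average the most recent readings to smooth out momentary spikes."""
    recent = readings[-window:]
    return sum(recent) / len(recent)

def check_stress(readings, threshold=0.7):
    """Return a coping suggestion when smoothed stress crosses the threshold."""
    if rolling_average(readings) > threshold:
        return random.choice(SUGGESTIONS)
    return None
```

In the system the article describes, the reading would arrive over a wireless network and the suggestion would go out as a text message; the sketch captures only the threshold-and-suggest logic in between.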
A more elaborate system that Czerwinski’s team has tested tracks a person during a day at the office using a wristband and chair sensors, facial recognition technology, voice recorders that are said to detect state of mind, and GPS data. It maps out the user’s levels of excitement, stress and anger throughout the day with color- and shape-coded blobs on a monitor.
“You don’t need this system to remember a real trauma, but [it identifies] the micro-patterns of badness in your day that you don’t know are happening but may put you in a bad mood. Maybe there’s a person in your life who makes you sad or angry or bored. Over a year, more critical patterns could be detected,” Czerwinski says.
Her team plans to make keyboards sensitive to how hard they’re being struck, another possible indicator of mood. “We think about [having the keyboard send] a message — ‘Don’t send that e-mail!’ — when you’re typing real hard,” she said.
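The keyboard idea reduces to a simple check on keystroke force. This sketch is purely hypothetical — the article names no units, threshold, or API, so the pressure scale and cutoffs here are invented:

```python
def typing_pressure_warning(pressures, hard=0.8, fraction=0.5):
    """Warn when more than `fraction` of recent keystrokes exceed `hard`.

    `pressures` is a list of per-keystroke force readings on an
    assumed 0-to-1 scale.
    """
    if not pressures:
        return None
    hard_hits = sum(1 for p in pressures if p > hard)
    if hard_hits / len(pressures) > fraction:
        return "Don't send that e-mail!"
    return None
```

Judging mood from a fraction of hard strikes, rather than any single one, avoids flagging a user who merely punches the Enter key now and then.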
Czerwinski doesn’t have much evidence of the efficacy of her technology, nor does Picard. But both say that their devices have given them surprising clues about their own states of mind.
Picard has been wearing one or more sensors for several years. This has given her a deep sense of where subtle mood swings can be felt, she says. “You realize that these devices measure a feeling you’ve had your whole life that you don’t know the name of. It’s the feeling of your hands getting sweaty before something. It’s the feeling of something that matters happening.”
On a day when she was taking her teenage son and friends out for a birthday celebration at an amusement park, she expected her stress to peak on a roller coaster. In fact, the peak hit at her house in the morning, when a hitch came up and she was afraid she’d have to cancel the party. “You want so badly to succeed and make your child happy,” she says. “It reflects a deep-seated concern we have about situations that are out of our control.”
Picard and an MIT colleague, Rana el Kaliouby, created a company, Affectiva, that marketed the wrist sensors for a few years before shelving them in favor of a second system they invented, which uses cameras to identify emotional states such as surprise, dislike and embarrassment from fleeting scans of the face. The MIT scientists have recorded more than a billion expressions from thousands of people and fed them into a computer algorithm that gradually grew adept at interpreting a given expression on anyone’s face.
Picard said marketing companies have jumped at this technology, which they use to detect nonverbal signals from consumers, either in focus groups or via webcams. Affectiva’s clients include Unilever and Coca-Cola, as well as some major Madison Avenue firms. In a focus group that Picard cites as an example of the technology’s value to marketers, rural women in India spoke of the immodesty of a body lotion ad in which a husband put his hand on his wife’s midriff. But the women’s facial expressions while the ad screened suggested they liked it. The company went ahead and ran the ad.
Czerwinski’s machines are not geniuses at figuring out how to make you feel better. The interventions are the same ones cognitive psychology has offered for decades: Go take a walk. Stretch, take a deep breath, call a friend, play some music or a video game. The objective, Czerwinski says, is to increase mindfulness, or awareness of how we feel in particular circumstances.
William R. Hersh, a physician and director of informatics at Oregon Health & Science University in Portland, recently heard Czerwinski speak about affective technology but was not convinced the measurements would be of any particular therapeutic use.
Not all stress is bad, he pointed out, and he wasn’t eager to have yet another technological fix for a human problem. “I’m one of those people who is already wedded to technology, and the thought of being wedded to it another way seems intrusive,” he said.
The people developing affective devices share some of these misgivings. “We all have the same uncomfortable feeling that we’re headed into this frontier where things are going to know more about us than we’re comfortable with,” Roseway says. “We feel this in our guts, but what does it mean? Part of our work is to try to make sense of where we’re headed, so we can share some solid learning about it. We’re like everyone else, riding the waves and seeing where we’re going.”
Like it or not, the technology is coming at us, she says: “In the future, we may not even see the sensors we’re wearing. It’ll be integrated into our person.”
“They’ll be embedded,” Czerwinski adds.
Allen is a freelance writer and author of “The Fantastic Laboratory of Dr. Weigl,” which will be published in July.