We hope our tone is clear here: We don’t need this kind of criticism from a computer. The Halo collects the most intimate information we’ve seen from a consumer health gadget — and makes the absolute least use of it. This wearable is much better at helping Amazon gather data than at helping you get healthy and happy.
Since August, the Halo has been listed by Amazon as an “early access” product that requires an “invitation” to buy. (It will cost $100 plus a $4 monthly fee once it’s sold widely.) We’re reviewing the Halo now because Amazon’s first digital wellness product offers a glimpse of how one of tech’s most influential companies thinks about the future of health. And what could be better to do when we’re lonely during a pandemic than have an always-listening device point out our flaws? Amazon chief executive Jeff Bezos owns The Washington Post, but we review all technology with the same critical eye.
Late to the fitness-tracker market dominated by the Apple Watch and Fitbit, the fabric-covered Halo has no screen, no sounds or vibrations and no design innovation. Like its competitors, it contains sensors that monitor physical activity, heart rate, sleep and skin temperature. A companion phone app is the only place to see what it has learned.
But the Halo pushes into uncharted territory by also collecting new, unabashedly invasive kinds of personal information — including body photos and voice recordings — and then feeding them into Amazon’s software for analysis. They’re going for AI doctor, or at least life coach.
One reason we tested two Halo bands, one on each of our bodies, was to see how well the AI could account for gender and other important human differences — particularly on factors as complicated as fat composition and tone of voice. Spoiler alert: It described Geoffrey’s tone with words like “opinionated,” while it was more likely to flag the tone of Heather, a mom of two, as “dismissive” and even “condescending.”
“We can bring some unique expertise in AI and machine learning,” Amazon medical officer Maulik Majmudar told us in an interview. “There are many examples of this in the product you see, but the specific ones that could come to mind are body [fat analysis] and tone — that’s a more comprehensive and holistic view of health than just physical health alone.”
We’re also believers that, in the long run, personal data might be able to help people get healthier or even detect diseases like covid-19, the subject of a flurry of recent research. But over the past decade, wearable tech hasn’t made much of a dent in the United States’ growing obesity rate. Our Fitbits and Apple Watches don’t really know how to turn mountains of body data into actionable insight and behavior changes.
Amazon’s problem is, the Halo does it even worse.
More data, if you believe it
On the fitness tracker basics of measuring activity and sleep, the Halo is more erratic than its competitors. In a seated, side-by-side test, the Halo’s heart rate readings are similar to an Apple Watch 6 and Fitbit Sense. But during a bike ride, the Halo reported a peak heart rate of 129 bpm, while the Apple Watch reported 171 bpm. Part of the problem is there are no buttons to tell the Halo you’re about to exercise — it just tries to figure out for itself what you’re up to using its algorithms.
You also have to trust Amazon’s AI to accurately estimate other body measures, starting with a near-naked total body scan. The health baseline most doctors use is body mass index, or BMI, a score based on height and weight. Amazon says a better measure is body fat percentage, which it calculates by asking you to stand in front of your phone’s camera in your skivvies for a 360-degree photo shoot and then sending the shots to Amazon’s cloud for analysis.
Evaluating the accuracy of Amazon’s fat measurement is difficult without a doctor or dietitian. One home device that calculates fat by sending a light current through your feet, the Withings Body+ scale, reported Geoffrey’s body fat was five percentage points lower than Halo’s reading. Heather’s fat estimate was nearly identical on the Withings scale and Halo. (We’re not going to tell you exactly how out of shape it thinks we are.) Amazon claims its AI is more accurate than the technology in smart scales, although it has not been cleared by the Food and Drug Administration.
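Amazon’s body-fat model is proprietary, but the BMI baseline it argues against is simple, published arithmetic: weight in kilograms divided by the square of height in meters, bucketed by the standard WHO adult cutoffs. A minimal sketch (the function names are ours):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / (height_m ** 2)

def bmi_category(score: float) -> str:
    """Standard WHO cutoffs for adults."""
    if score < 18.5:
        return "underweight"
    if score < 25:
        return "normal"
    if score < 30:
        return "overweight"
    return "obese"

# A 70 kg, 1.75 m adult:
print(round(bmi(70, 1.75), 1))          # 22.9
print(bmi_category(bmi(70, 1.75)))      # normal
```

The critique of BMI is that two people with the same score can carry very different amounts of fat and muscle, which is the gap Amazon’s photo analysis claims to close.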
The Halo’s voice tone analysis is questionable on a whole other level. You train the device to recognize your voice by reading sample phrases, and then it listens constantly for moments in conversation that go beyond your neutral tone. (There is a button you can press to temporarily turn off the microphone.) The Halo plots these moments as positive vs. negative and high vs. low energy, and then applies more nuanced descriptors to them — for example, a voice that registers as negative and low energy might be classified as “discouraged.” You can review a dozen or more of these per day in the Halo app.
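The two-axis plot the app describes amounts to a quadrant lookup. Here is a hypothetical sketch of that idea; “discouraged” is the label the app itself uses for a negative, low-energy voice, while the other three labels are purely illustrative placeholders, not Amazon’s actual vocabulary:

```python
def tone_label(positivity: float, energy: float) -> str:
    """Map a moment of speech to one descriptor per quadrant.

    positivity and energy each run from -1.0 (low) to 1.0 (high).
    Only "discouraged" is documented; the rest are stand-ins.
    """
    if positivity < 0 and energy < 0:
        return "discouraged"     # negative + low energy (per the app)
    if positivity < 0:
        return "irritated"       # illustrative
    if energy < 0:
        return "calm"            # illustrative
    return "excited"             # illustrative

print(tone_label(-0.4, -0.6))    # discouraged
```

The real system clearly uses a far larger descriptor set (“stern,” “hesitant,” “condescending”), but the underlying idea — score two dimensions, then name the region — is the same.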
How could a computer possibly know you sound like a Debbie Downer? Amazon said it spent years training its tone AI by having people categorize voice recordings. The company held internal trials and says it tried to address any biases that might arise from varying ethnicity, gender, or age.
In our experience, the Halo could detect ups and downs in our voice, but seemed to misinterpret situations regularly. And some of the feedback feels, ironically, a bit tone-deaf — especially when judging a woman’s voice.
Our sample size of two isn’t sufficient to conclude whether Amazon’s AI has gender bias. But when we both analyzed our weeks of tone data, some patterns emerged. The three most-used terms to describe each of us were the same: “focused,” “interested” and “knowledgeable.” The terms diverged when we filtered just for ones with negative connotations. In declining order of frequency, the Halo described Geoffrey’s tone as “sad,” “opinionated,” “stern” and “hesitant.” Heather, on the other hand, got “dismissive,” “stubborn,” “stern” and “condescending.”
She doesn’t dispute that she may have sounded like that, especially while talking to her children. But some of the terms, including “dismissive” and “condescending,” hit Heather differently than they might a male user. The very existence of a tone-policing AI that makes judgment calls in those terms feels sexist. Amazon has created an automated system that essentially says, “Hey, sweetie, why don’t you smile more?”
From data to action
Each new generation of wearable gadget has invented new motivational scores to help you get healthier. The Fitbit nudged you to take 10,000 daily steps; the Apple Watch urges you to close three colorful daily rings for activity, exercise and standing. Amazon’s Halo gives you an activity score with a goal of 150 points per week. Say what?
Amazon says it’s not arbitrary: The score is based on guidelines from the American Heart Association that people should aim for 150 minutes of moderate exercise per week, although the points don’t translate exactly to time. The Halo gives more points for more-strenuous activity, and you lose points for stretches of inactivity.
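Amazon hasn’t published the exact weights, but the shape of the system is clear from its description: strenuous minutes count more than moderate ones, and long sedentary stretches subtract points. A hypothetical sketch, with placeholder weights of our own choosing:

```python
def weekly_activity_points(intense_min: float, moderate_min: float,
                           sedentary_hours: float) -> float:
    """Illustrative Halo-style score. Real weights are not public;
    here intense minutes count double, moderate minutes count once,
    and each sedentary hour past 8 per day costs a point."""
    points = 2 * intense_min + 1 * moderate_min
    points -= max(0.0, sedentary_hours - 8 * 7)   # 8 h/day allowance
    return points

# 150 minutes of moderate exercise (the AHA guideline) hits the goal:
print(weekly_activity_points(0, 150, 56))   # 150.0
```

Under a scheme like this, the score tracks the AHA’s 150-minute target without mapping one point to one minute, which matches Amazon’s caveat that the points don’t translate exactly to time.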
It is smart to measure activity that actually exercises your heart (which taking steps may not). But putting it on a weekly scale of 150 — and burying your score in the Halo app — means it’s mostly meaningless as motivation in the moment when you have to decide to go for a jog or kick back on the couch.
Some people may be more receptive to criticism coming from an app than from a person. The Halo app also turns your body scans into a 3-D rendering, complete with a slider to see what you’d look like with more, or less, fat.
But we were disappointed that even with all that highly personal (somewhat naked) data, the Halo doesn’t offer any kind of personalized plan or path forward. The app has what it calls Labs, a hysterically techy term for what are mostly just videos or audio recordings from outside companies such as Orangetheory and Openfit, with little reminders to watch them. The selections are the same for everyone, not tailored to your data or goals.
Even more perplexing is that Amazon thinks there’s anything motivating about its harshly worded reports on your conversational tone. “For the most part, people are relatively unaware of how they sound to others and the impact that may have on their personal and professional relationships,” Majmudar said. But he also told us that Amazon is purposefully not providing any kind of diagnosis or interpretation of a person’s emotional state.
It’s hard to even match up the Halo’s daily AI tone judgments to what you actually said — you get a time stamp, but not a transcription. Was that panicked, overwhelmed moment tied to a breaking glass or just idle chitchat about coffee?
The Halo app’s Labs offer lectures from a communications expert about “conscious listening.” But there are no personalized suggestions based on your tone, such as how to sound less “sad” in the middle of an isolated holiday season during a pandemic.
The Halo has invented a new personal behavior to feel self-conscious about, which we suppose is a kind of innovation.
All about the data
While reviewing the Halo, we couldn’t shake the suspicion it was just another effort by Amazon to collect more data about customers’ lives.
Amazon approached some aspects of Halo data more carefully than it has other recent products. Unlike its Echo smart speakers, the Halo does not send Amazon recordings of your voice. Instead, it sends recordings to your phone for analysis, and then deletes them from both the band and the phone. Your body fat photos are sent to Amazon’s cloud for processing, then deleted from its systems. (You can choose to keep a copy on your phone.)
But that still leaves open plenty of other ways for Amazon to profit from your information. In an anonymized way, it can data mine the heart rate, activity, sleep and tone patterns of Halo owners, using the information to tailor its health algorithms and learn about human bodies. Make no mistake: Disrupting medicine is the next goal for big tech.
Those ambitions are just hard to square with the half-baked product the Halo is today. But half-baked products are not totally out of character for Amazon, which has a history of tossing out weird, creepy ideas and letting customers do the testing. The first Echo smart speaker was not very useful, until devoted customers fed Amazon lots of data to train its talking Alexa assistant. Amazon also has killed off other wacky products, including a camera called the Echo Look that used AI to judge fashion.
AI-powered devices such as the Halo may well be the future of health. But they are not the present.