First there were phones with screens you could touch, then came ones that listened for your voice and knew your face. Now Google is literally bringing a new dimension to phones: awareness of how you move all around them. It just has to demonstrate why we would want that.
At an event in New York City, the online search giant unveiled its fourth homemade Pixel smartphone, along with new Nest smart speakers and WiFi routers, a laptop and wireless Pixel Buds headphones. The products might pique the interest of Android phone owners who like Google’s software and services, though I suspect they won’t win over many loyal Samsung and Apple owners.
Still, you’ve got to give Google credit for keeping the competition interesting. Google has an unusual role in the gadget world. It already makes the Android software that runs most of the world’s phones made by other companies, including Samsung and Huawei. But in the last three years, it has spent more than a billion dollars to become an incubator of its own hardware, too. Its goal has been to explore new consumer applications for its research into artificial intelligence and other technologies.
Google has made the biggest impact in smart home products, where its talking Assistant AI competes with products containing Amazon’s Alexa. Google’s edge in this space is better, more colorful designs, and friendlier software and privacy settings. On Tuesday, Google introduced a new Nest WiFi router ($270 for two pieces) that includes a speaker and microphone to access Assistant. And a new version of its $50 Nest Mini smart speaker promises improved sound quality and faster response times. (Amazon CEO Jeff Bezos owns The Washington Post.)
Smartphones are a bigger deal in our lives, but have proven much harder for Google to crack. After a decade of smartphones, people are upgrading high-end ones less often, and are largely locked into brand loyalties.
This year’s Pixel 4 and Pixel 4 XL, which begin shipping October 24, are the first phones Google developed with the engineering talent from its $1.1 billion deal with Taiwanese phone maker HTC. The Pixel 4 catches up with Apple by ditching the fingerprint reader for a facial-recognition unlock system. (Unfortunately, Google already made a mess of this: one of its contractors was found paying darker-skinned homeless people to scan their faces to improve the tech.) Google says it will also begin selling its own competitor to Apple’s AirPods wireless headphones, the $180 Pixel Buds, next year.
The Pixel 4 promises deeper tie-ins to Google’s services, particularly Assistant. The AI now responds more quickly because the phone processes some commands locally without first sending them over the Internet to Google. And a new voice recorder app transcribes what it hears as it saves it, making it easier to search.
The Pixel 4’s camera will likely remain the biggest draw for many. It now features two back lenses, including one for zoom shots. Though Apple and Samsung have phones including three back lenses, the software that runs the Pixel camera — tuned to turn out shots we’ll find appealing — still feels about a year ahead of the competition. Last year’s Pixel 3 added a standout “Night Sight” mode for lowlight situations. This year it adds the ability to take photos of the stars.
It’s the hand-waving capabilities that Google hopes will set the Pixel 4 apart among customers seeking out the most cutting-edge tech. The feature, dubbed “motion sense,” relies on a miniaturized radar inside the phone that constantly looks for motion in an invisible “bubble” around the device with a radius of about three feet. The radar, known as Project Soli, was developed over five years by an advanced-technologies skunkworks team within Google. It uses so little power that it remains active even when the phone’s screen is off and otherwise inactive.
Uses for motion sense are, so far, limited. They start with gesture controls: Wave your hand over the phone to advance a song — or wave it the other way to go back. You can also wave away ringers and alarms to silence them. That’s helpful when you’re … perhaps, driving? Or maybe eating Cheetos? Well, at the very least it’s useful to impress other geeks.
Some other phone makers such as LG have offered gesture-control capabilities using their front-facing cameras. Google says its radar goes much further by giving the phone an awareness of what’s going on around it. When it knows a person is nearby, the Pixel 4 turns its screen on (or off), and it lowers the volume of rings when it senses a hand reaching for it. (This makes the phone more “polite,” says Google.) It also fires up the facial-identification cameras early, so the phone unlocks more quickly.
A phone that’s more “aware,” to use Google’s term, also sounds a bit frightening in a world where data about our lives has become Google’s most profitable commodity. Google says motion sense can’t tell the difference between specific people, at least not yet. (How long until a murder case in which the evidence that someone else was in the room comes from a Pixel’s radar?) The motion data gets processed on the phone, and isn’t saved to your Google account or shared with other Google services. And other app developers can’t access the sensor, so far.
It’s hard to tell where this idea will go, but it’s a pretty big leap to think a phone can now know where you — or something else important — is located. Perhaps this sort of data will become useful for future augmented-reality applications.
Entering its fourth year, the Pixel remains somewhere between a curiosity and an existential threat for Apple and Samsung. But adding a new category of sensor to the already mature smartphone is a bold, possibly crazy move we don’t see often from hardware makers these days.
So for that, I’ll wave a high five to the Pixel 4. I know it’s watching.
Read more tech advice and analysis from Geoffrey A. Fowler: