For Apple, more AI and more integrations with third-party services will mean less fatigue for consumers, who are already overwhelmed with too many apps, too many devices and too much data. Ultimately, artificial intelligence working behind the scenes could make it easier for users to organize their ever-growing photo collections, communicate and use online services more efficiently, and toggle less between devices. The moves also come at a time when tech giants and a wave of new start-ups are racing to create similar artificial-intelligence-based products.
For example, Apple will now scan your photos using facial recognition to cluster the people in your photo collection. If pictures of grandpa are scattered across your library, Apple will now find them through facial recognition and group them together so you can organize your memories without having to sift through them on your own. Facebook has had automatic facial recognition for several years. Another Apple feature, called Memories, groups photos together by location and even shows where they were taken on a map. This stuff isn’t simple: Behind the scenes, the software performs 11 billion computations on each photo to make this happen, Apple said.
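The general idea behind this kind of clustering can be sketched in a few lines. The sketch below is purely illustrative and is not Apple's pipeline: it assumes each detected face has already been reduced to a numeric "embedding" vector, and simply groups faces whose vectors sit close together as the same person.

```python
# Toy sketch of face clustering (not Apple's actual algorithm).
# Assumption: each face is already an embedding vector; faces of the
# same person produce vectors that are close in this space.

def distance(a, b):
    """Euclidean distance between two embedding vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def cluster_faces(embeddings, threshold=0.5):
    """Greedy clustering: put each face in the first cluster whose
    representative (first member) is within `threshold`; otherwise
    start a new cluster."""
    clusters = []
    for emb in embeddings:
        for cluster in clusters:
            if distance(emb, cluster[0]) < threshold:
                cluster.append(emb)
                break
        else:
            clusters.append([emb])
    return clusters

# Toy data: two near-identical "grandpa" faces and one unrelated face.
faces = [[0.9, 0.1], [0.88, 0.12], [0.1, 0.9]]
print(len(cluster_faces(faces)))  # prints 2: grandpa's photos grouped
```

A production system would use embeddings from a trained neural network and a more robust clustering method, but the grouping step reduces to the same idea: nearby vectors mean the same face.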
Apple’s other AI announcements borrowed heavily from Cortana, Microsoft’s voice-based virtual assistant that works in Windows. Executives said that Siri is now coming to desktop computers: Soon consumers will be able to talk to their Mac computers just the way they talk to their phones.
Overall, more consumers are talking to technology: Google recently said that roughly 20% of all queries are initiated by voice rather than typing. While Cortana has enabled users to talk to their desktops for more than a year, Apple has the advantage of being able to bring a tool that is already popular on mobile to the desktop, making the experience of moving between devices more seamless. Like Cortana, Siri will also now scan people’s communications and make suggestions. If the system sees two people discussing a meeting over text message, a calendar icon will pop up, enabling the users to schedule the meeting from within their texting thread. Apple will also suggest relevant emoji, which will get bigger and more interesting in the new operating system. (Another handy feature: You can also tell Siri on your computer to send a text message to someone's phone.)
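To make the meeting-detection idea concrete, here is a deliberately simplified sketch. It is not how Apple or Microsoft actually do it; real systems rely on trained language models, while this hypothetical version just looks for meeting-like words plus a time and, if both appear, surfaces a calendar suggestion.

```python
# Hypothetical, keyword-based sketch of intent detection in messages.
# Real assistants use statistical language understanding, not rules.
import re

MEETING_WORDS = {"meet", "meeting", "lunch", "call"}

def suggest_calendar_event(message):
    """Return a calendar suggestion if the message looks like it
    proposes a meeting at a specific time, else None."""
    text = message.lower()
    words = set(re.findall(r"[a-z]+", text))
    time_match = re.search(r"\b\d{1,2}(:\d{2})?\s*(am|pm)\b", text)
    if words & MEETING_WORDS and time_match:
        return {"suggestion": "create_event", "time": time_match.group(0)}
    return None

print(suggest_calendar_event("Want to meet for lunch at 12:30 pm?"))
# -> {'suggestion': 'create_event', 'time': '12:30 pm'}
print(suggest_calendar_event("See you later"))  # -> None
```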
In opening major applications to third parties, Apple is nodding to a growing view in Silicon Valley that consumers are seeking an alternative to toggling between the dizzying number of apps they store on their phones. They want to call an Uber or a Lyft, for example, without having to open an app; now outside developers will be able to build those services directly into Apple’s messaging platform. Facebook recently launched a similar feature inside its popular messaging app.
In addition to having Siri on the desktop, the changes will allow developers to supercharge their apps with Siri's voice. Users will soon be able to use Slack, Uber or Skype by talking directly to Siri. This widely anticipated move takes a page from Amazon. For some time now, the company's Alexa smart home assistant has allowed third parties to build services onto its platform, so consumers can ask Alexa to read out the weather or connect to smart locks. (Amazon founder and chief executive Jeffrey P. Bezos owns The Washington Post.)
While Apple's rivals have made earlier announcements and perhaps bigger investments in AI technology, Apple's slower approach lets it integrate a suite of features more seamlessly into its products and potentially avoid mistakes. Last year, for instance, Google introduced a feature to help consumers automatically organize photos. But the software became infamous when it incorrectly tagged some African Americans as gorillas.
Apple is also trying to reduce app fatigue by enabling consumers to do more within a single application. Want to call an Uber without leaving your text message conversation? You can do that within iMessage. Want to send someone a text message from your computer? Now you can tell Siri. The company is also attempting to relieve some password fatigue by letting you log in to your computer with your Apple Watch, so you don't have to type another password. And Apple Pay will be expanded too, so that it now works on your desktop on a number of e-commerce sites. That means consumers who prefer to use Apple can avoid painstakingly entering and storing their credit card info with a slew of vendors.
In the past, Apple has been resistant, on privacy grounds, to scanning consumer data, but today's announcements are further evidence that the company is gradually moving away from its earlier stance. In his presentation, Craig Federighi, an Apple senior vice president, made a point of saying that new features had been reviewed in advance by a well-known privacy researcher, Aaron Roth. He even flashed a slide with a quote from Roth in which the researcher gave his blessing to the company's privacy designs for its new products. If Apple is scanning the behavior of many users to suggest what emoji they might want to use, for example, the company will now obscure individuals' identities by adding incorrect information, so-called mathematical noise, into the system. (Roth did not immediately respond to requests for comment.)
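One classic version of this "mathematical noise" idea is randomized response, a basic differential-privacy technique. The sketch below is illustrative only, and Apple's production system is more elaborate: each user's report of whether they used an emoji is sometimes replaced by a coin flip, so no single report reveals the truth, yet the aggregate rate can still be recovered.

```python
# Hedged sketch of randomized response, a simple differential-privacy
# technique (illustrative; not Apple's exact mechanism).
import random

def noisy_report(used_emoji, p_truth=0.75):
    """With probability p_truth, report honestly; otherwise report a
    uniformly random yes/no. No single report is trustworthy."""
    if random.random() < p_truth:
        return used_emoji
    return random.random() < 0.5

def estimate_true_rate(reports, p_truth=0.75):
    """Invert the noise in aggregate:
    E[report] = p_truth * rate + (1 - p_truth) * 0.5."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p_truth) * 0.5) / p_truth

random.seed(0)
true_users = [True] * 6000 + [False] * 4000  # true rate: 60%
reports = [noisy_report(u) for u in true_users]
print(round(estimate_true_rate(reports), 2))  # close to 0.6
```

The key property: any individual answer is deniable, but across thousands of users the noise averages out, which is how a company can learn which emoji are popular without learning what any one person typed.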
Opening up its platforms to third parties has also historically been a point of discomfort for Apple, as the company's impulse to control the quality and integrity of its own products has butted up against major trends in AI. Those trends emphasize merging more data from third parties to increase the number of services that can be offered on a single platform. The original Siri included integrations with many third parties that were dissolved after Apple bought the Siri startup five years ago. Today, against a wave of outside pressures, Apple has just about come full circle.