SAN JOSE — In 2011, Apple became the first company to place artificial intelligence in the pockets of millions of consumers when it launched the voice assistant Siri on the iPhone.
Six years later, the technology giant is struggling to find its voice in AI.
Analysts say the question of whether Apple can succeed in building great artificial-intelligence products is as fundamental to the company’s next decade as the iPhone was to its previous one. But the tech giant faces a formidable dilemma because the nature of artificial intelligence pushes Apple far out of its comfort zone of sleekly designed hardware and services.
AI programming demands a level of data collection and mining that is at odds with Apple’s rigorous approach to privacy, as well as its positioning as a company that doesn’t profile consumers. Moreover, Apple’s long-standing penchant for secrecy has made the company less desirable in the eyes of potential star recruits, who hail from the country’s top computer science departments and are attracted to companies that publish research.
“Artificial intelligence is not in Apple’s DNA,” said venture capitalist and Apple analyst Gene Munster. “They understand that in the future, every company is going to become an AI company, and they are in a particularly tough spot.”
At Apple’s annual developers conference Monday — the same event where Siri was introduced — the company's efforts to become an AI powerhouse were on display as executives launched a new stand-alone smart speaker and touted features meant to boost Siri’s chops and to power AI applications on Apple products.
“Machine learning” — an AI buzzword for software that improves at a task by finding statistical patterns in data rather than following explicitly programmed rules — was repeated throughout the 2½-hour presentation, delivered to an audience of roughly 6,000 developers here. Siri will now use machine learning to predict the times of a morning commute, or scan travel news as you read it in the company’s Safari browser and then suggest related activities, such as booking a reservation.
She will use machine learning to talk with you and help you sort through music on a new $349 smart speaker, the HomePod. She will automatically organize your photos into albums, such as “2nd Anniversary,” without you giving her any context about the pictures. There was even a new software tool kit, Core ML, designed to let apps run machine-learning models quickly on Apple devices. (It’s six times as fast as Google’s rival AI processor, an executive quipped.)
But Monday's announcements come as other technology companies have released similar innovations and have already spent billions on the burgeoning AI arms race. Many are placing their bets on artificial intelligence — software that one day may be smart enough to chat back and forth like a human, or computer vision that identifies real-world objects so well it can power the first fully functioning self-driving car.
That has put Apple in the disadvantaged position of trying to lead in an area where it has fallen behind — and where the effort cuts against core aspects of the company’s secretive culture.
“This is a substantial shift for Apple,” said Daniel Gross, a former Apple executive who focused on artificial intelligence. “The internal focus is on building great products, not publishing papers.”
Ten years after launching the iPhone, Apple is on the hunt for another blockbuster product that can take its place. Sales of the iPhone propelled Apple to become the most valuable company in the world and still account for more than half of the company’s revenue, which was $215.6 billion in 2016. But last year, purchases of the company’s smartphone dropped for the first time, suggesting that the market for high-end smartphones may finally be saturated.
“The difference between last year’s iPhone and this year’s iPhone is going to be much smaller than the difference between the first and second iterations of the iPhone. . . . So that S-curve is starting to flatten out,” said Benedict Evans, a mobile expert and partner at the Silicon Valley venture capital firm Andreessen Horowitz, using an industry term for growth that accelerates rapidly and then levels off. “Then you have the next transformative technology that isn’t here yet.”
Apple did not respond to repeated requests for interviews and comment.
Apple’s launch of the HomePod, which will go on sale later this year, comes several years after Amazon.com and Google released their own voice-controlled speakers, the Echo and the Google Home, which send consumers’ spoken queries back to their servers for analysis. Google, Tesla, Uber, and others have been testing self-driving vehicles on public roadways for several years; Apple received its first permit to begin testing just two months ago. Before Monday’s updates, the Siri assistant was still mostly a glorified Web search that could tell the occasional preprogrammed joke.
Now Apple is racing to catch up. Last October, the company hired Russ Salakhutdinov, a Carnegie Mellon professor whose expertise is in an area of artificial intelligence known as “deep” or “unsupervised” learning, a complex branch of machine learning in which computers are trained to replicate the way the brain’s neurons fire when they recognize objects or speech. Salakhutdinov is a protégé of Geoffrey Hinton, perhaps the world’s top researcher in this area. Salakhutdinov divides his time between Carnegie Mellon and Apple. Hinton divides his time between Google and the University of Toronto.
Building ties to academic superstars not only helps to improve products but also becomes a key recruiting tool, said Richard Zemel, director of the Vector Institute for Artificial Intelligence and a professor specializing in machine learning at the University of Toronto.
“You used to not see people with Apple name badges at conferences, and now you do,” Zemel said.
In December, Apple presented and published its first academic paper on artificial intelligence at an industry conference. Another paper has been accepted to a computer vision conference and will be published in July, Salakhutdinov said in an interview. Salakhutdinov said he was not authorized to discuss his work at Apple in any detail.
Zemel, who told Bloomberg two years ago that Apple was “off the scale” in terms of secrecy and that such secrecy was keeping it out of the loop on major developments in the field, now said that the Cupertino, Calif., giant was “making some changes.”
“But it’s going to take some work,” he added.
Researchers at elite universities said in interviews that Apple was still not the top choice for their computer science graduates — Google, Facebook and Amazon were by far the top picks — but that the company was moving up in the rankings. (Amazon chief executive Jeffrey P. Bezos owns The Washington Post.)
Apple’s forays into AI have also been slower than its peers’ because it has been reluctant to embrace the data-mining practices of rivals Google and Facebook, experts said. The company has spent considerable resources building additional layers of privacy. Unlike Google and Facebook, which are primarily advertising companies that collect massive amounts of intimate data to profile their users, Apple believes in limiting the amount of user data it collects. At a previous developers conference, executives bragged that the company did not build user profiles. Chief executive Tim Cook has positioned the company as the anti-Google.
But that stance against data collection becomes a problem if you are building artificial intelligence, researchers say. A home device must collect and analyze people’s speech to improve the way the device can speak to humans, for example. For Siri to be smart, she needs to collect and interpret data from other applications, such as your calendar, your restaurant reservations and, now, your browsing.
Last year, as Apple began to embrace artificial intelligence on the iPhone, the company undertook a large privacy protection project. The project took an academic concept called differential privacy and applied it to AI applications on the iPhone. Differential privacy works by inserting noise — or bad information — into good data to confuse outsiders who might try to home in on an individual’s records. For example, in order for Apple software to group the photos of your dog into a single album, it needs to collect many photos of your dog.
Apple collects those images, but not before encrypting the data in them and then scrambling that data with other data, so that anyone who tried to recover the original data set could not tell which records belonged to a single user, the company claims. This technique is considered a stronger privacy protection than other methods, such as using mathematical formulas to render user profiles anonymous.
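The core intuition can be seen in a classic textbook mechanism called randomized response, one of the simplest forms of differential privacy. This is a minimal illustrative sketch, not Apple’s actual implementation; the function names and the flip probability are made up for the example. Each user’s true answer is randomly flipped before it is reported, so no single report can be trusted — yet because the noise level is known, accurate statistics can still be recovered across millions of users.

```python
import random

def randomized_response(truth: bool, p_flip: float = 0.25) -> bool:
    """Report the user's true bit, but flip it with probability p_flip.

    Any individual report is deniable: a 'yes' might be noise.
    """
    return (not truth) if random.random() < p_flip else truth

def estimate_true_rate(reports, p_flip: float = 0.25) -> float:
    """Undo the known noise in aggregate.

    observed = true_rate * (1 - p_flip) + (1 - true_rate) * p_flip,
    so true_rate = (observed - p_flip) / (1 - 2 * p_flip).
    """
    observed = sum(reports) / len(reports)
    return (observed - p_flip) / (1 - 2 * p_flip)

# Simulate 100,000 users, 30% of whom truly have some attribute.
random.seed(0)
truths = [random.random() < 0.30 for _ in range(100_000)]
reports = [randomized_response(t) for t in truths]
print(round(estimate_true_rate(reports), 2))  # close to 0.30
```

The trade-off the article describes falls out directly: the more noise each device injects, the better any one user is protected, and the more total data the company must collect before the aggregate signal becomes useful.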
Apple’s focus on privacy may have slowed the company down in terms of building some products, Gross said, but the trade-off would be consumer trust. “Apple is dousing itself with an extra piece of really hard science and doing so to try and preserve your privacy,” he said. “I think Google and Facebook will have to answer to a world where a similar product that is offered is more privacy-preserving.”
Munster pointed out that no tech company has a huge lead on artificial intelligence yet. “The bad news is that Apple is behind,” he said. “The good news is that if we look at how AI is going to impact the world, it’s still early days — there is plenty of time to catch up.”
Correction: An earlier version of this article incorrectly said that Apple co-founder Steve Jobs introduced Siri eight years ago. It also misstated the first name of venture capitalist and Apple analyst Gene Munster. The article has been updated.