Microsoft researchers have enabled elevators in a company building to detect the likelihood that a person walking by will want to board. The camera in a Microsoft Kinect, positioned in the ceiling, tracked for months the behaviors of people who got on the elevators versus those who bypassed them on their way to a nearby cafeteria. That data fed an artificial intelligence system, which taught itself to identify the behaviors that distinguish people who want to board an elevator from those who don’t. Soon the elevator doors would open automatically as a person intending to board approached.
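The learning step described above can be sketched as a simple supervised classifier trained on tracked behavior. Everything below — the feature choices (walking speed, heading, distance to the doors), the sample data, and the hand-rolled logistic-regression model — is an illustrative assumption, not a description of Microsoft’s actual system.

```python
import math

# Hypothetical per-person features extracted from overhead depth tracking:
# (walking speed in m/s, heading-toward-doors cosine, distance to doors in m).
# Labels: 1 = boarded the elevator, 0 = walked past to the cafeteria.
# These numbers are invented for illustration.
samples = [
    ((0.4, 0.95, 1.0), 1),  # slow, aimed at the doors, close  -> boarded
    ((1.4, 0.10, 2.5), 0),  # fast, aimed past, farther away   -> walked by
    ((0.6, 0.90, 1.5), 1),
    ((1.2, 0.05, 3.0), 0),
]

weights = [0.0, 0.0, 0.0]
bias = 0.0

def predict(features):
    """Estimated probability that this passer-by intends to board."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))  # logistic squashing to [0, 1]

# Plain stochastic-gradient-descent training loop.
for _ in range(2000):
    for features, label in samples:
        err = predict(features) - label
        bias -= 0.1 * err
        weights = [w - 0.1 * err * x for w, x in zip(weights, features)]
```

After training, `predict` scores a new passer-by: someone ambling slowly toward the doors gets a high boarding probability, someone striding past gets a low one.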

The project was shepherded by Eric Horvitz, the co-director of Microsoft’s research lab in Redmond, Wash., which houses some of the company’s 1,100 scientists and engineers. Horvitz’s team has begun a second phase of the project experimenting with human-like interactions between elevators and the people riding them.

“Something as stodgy and old-fashioned as an elevator could have really cute gestures and curiosities and say ‘Are you coming?’ with a door motion,” Horvitz explained.

The smart elevator can’t predict with 100 percent accuracy whether a person lurking nearby wants to get on. So when it is uncertain, the elevator could rapidly jiggle its doors back and forth, a sign that it is unsure whether to close them. The elevator could then recognize a person waving a hand or shaking their head to say “Yes, I’m coming,” or “No, I’m taking the stairs.”
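The decision logic just described — act when confident, ask the rider when not — can be sketched in a few lines. The thresholds, the gesture names, and the `ask_rider` callback are all assumptions made for the sketch, not details from Microsoft’s project.

```python
# Illustrative decision logic for the "unsure elevator" described above.
# Thresholds and the gesture vocabulary are invented for this sketch.

CONFIDENT_YES = 0.8  # probability above which the doors just open
CONFIDENT_NO = 0.2   # probability below which the doors just close

def door_action(boarding_probability, ask_rider):
    """Decide what the doors do given the model's confidence.

    ask_rider is a callback that jiggles the doors and returns the
    gesture the camera observed: "wave" (I'm coming), "head_shake"
    (I'm not), or None (no response).
    """
    if boarding_probability >= CONFIDENT_YES:
        return "open"
    if boarding_probability <= CONFIDENT_NO:
        return "close"
    # Uncertain band: jiggle the doors and watch for a yes/no gesture.
    gesture = ask_rider()
    if gesture == "wave":
        return "open"
    return "close"
```

The design point is that the doors only "ask" inside the uncertain middle band, so confident predictions never bother the rider.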

Don’t expect to encounter an elevator like this anytime soon in the real world, but Horvitz says his lab has received interest from curious elevator manufacturers. The elevator project is typical of Horvitz’s vision for a better world, in which humanlike intelligence becomes part of all the machines we use.

“Everyone will expect things to be a little smarter. It’s the 21st century and we’re still requiring people to jam their legs into doors to tell the elevator they want to get on,” said Horvitz, who recently served as president of the Association for the Advancement of Artificial Intelligence. He sees world-changing potential as machines learn to think like humans and to understand how humans think.

“We build systems that adore people and that by design come to enjoy and work and support people and nurture them,” Horvitz said. For example, a robot is stationed outside the elevator bank on the floor where Horvitz has his office. If it notices a person lingering — and likely confused about where to go — it’ll offer directions.

Another Microsoft research project is a virtual personal assistant. Visitors to Horvitz’s office are greeted by a face on a computer screen. It will recommend whether it’s best to interrupt Horvitz, based upon who he is meeting with or what he is working on. The machine has learned which upcoming meetings Horvitz needs to be reminded of, and which meetings he can remember on his own. If he’s reading or going to a meeting in an atypical location, a reminder is more valuable.

He expects that everyone will one day have a virtual assistant that learns about them and exists on all of their digital devices.

“These intelligent assistants, across devices or a single device, will be the next frontier for computing and computer science. It’s the next place where there will be a lot of competitive forces who talk about my intelligent assistant vs. yours,” Horvitz said.