As 2012 winds down, lots of people are looking back at the year in tech. But at IBM, researchers have released a list of trends to expect not only in 2013, but in the next five years.
On Monday, the company released its annual “5 in 5” report, which offers up predictions about what technology innovations will catch on in the next half-decade. This year, the report focuses on how computers will process information in the future, and IBM’s researchers say that nature’s gift of five senses won’t be reserved for just the living: Machines may actually be able to process things as humans do — through touch, taste, sight, sound and smell.
That, said IBM vice president of innovation Bernie Meyerson, would be a major shift in the very architecture of computing.
“If you program a computer, it’s a gruesome undertaking,” said Meyerson, noting that — at its most basic level — the way humans load information, bit by bit, into computers hasn’t changed since the abacus.
But advances in computer technology, Meyerson said, are already allowing computers to look at an object holistically, taking in information in a moment that would have taken years to input through code.
“Say you’re standing in a museum of modern art, surrounded by paintings and sculptures,” Meyerson said. “You would spend the rest of your adult life trying to put that into words and type it in [to a computer]. Now, imagine if you could teach it by just showing it something.”
The idea, Meyerson said, is to give humans and computers a common language. And it’s not as difficult — or as futuristic — as you may think.
Smell and taste, Meyerson said, are two senses that have a clear chemical basis. If computers can sense certain types of molecules — ammonia, explosive residue or gases that indicate decay — they could alert users to markers that flag security risks or food-borne illnesses. The same is true of taste, he said, if computers could be programmed to recognize the correct proportions of certain chemicals. Or the machines could be used in health planning, to find healthy combinations of foods that would appeal to the palate of the dieter.
When it comes to sight, Meyerson said, researchers have improved recognition software that can identify objects based on a database of images already loaded into the system. And in the future, computers could “hear,” by using detailed sound analyses that, for example, can tie a certain pattern of notes in a baby’s cry to anguish or joy.
Finally, computers could learn to tell the difference between cashmere and concrete by reading the appropriate signals of vibration and temperature, Meyerson said. Video game makers have already used a very basic version of this: controllers vibrate when there’s impact between objects on-screen. In the next five years, researchers could take that sort of program to a microscopic level, allowing machines to have some sense of touch, Meyerson said.
While each idea has applications of its own across many industries, Meyerson said that they would have the greatest impact when combined.
“It’s not that you want to make computers smarter than humans,” he said. “But they have bandwidth to get it in... If you want to scale its memory, you can buy a box of disk drives.”