Elderly people can wear devices that call for help after a fall, but research has shown they aren’t always worn when falls happen. The MIT solution wouldn’t require anything to be worn: the system would learn to identify the person and could send alerts if he or she fell.
“We want to provide peace of mind without intruding too much on lives or taking independence away,” Katabi said.
The researchers foresee other uses too, such as video games and smart-home sensors. Nest, for example, monitors the movements of residents so that it can adjust temperatures and keep utility bills low.
The system works by sending out a radio signal that bounces off a person and back to the device; like wireless Internet signals, it travels through walls. Body parts that curve away from the device go unseen, because the signals they reflect deflect off to the side rather than returning. But as the person moves, the device stitches together the data points that do come back into a generally identifiable silhouette.
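To make the stitching idea concrete, here is a minimal sketch in Python. It is not the MIT team's code: the frame format, grid size, and simple accumulation rule are all illustrative assumptions about how sparse per-frame reflections could be merged over time.

```python
import numpy as np

def stitch_silhouette(frames, grid_shape=(64, 32)):
    """Accumulate sparse per-frame reflection points into a 2D silhouette.

    Each frame is a list of (row, col) grid cells where a reflection
    returned to the device; cells whose signals deflected away from the
    device simply never appear in any frame.
    """
    silhouette = np.zeros(grid_shape)
    for points in frames:
        for r, c in points:
            silhouette[r, c] += 1.0      # brighter = reflected more often
    if silhouette.max() > 0:
        silhouette /= silhouette.max()   # normalize into a heat-map-like image
    return silhouette

# Hypothetical capture: three frames, each catching different body parts
frames = [
    [(10, 15), (11, 15), (30, 16)],      # head and torso reflections
    [(10, 16), (31, 15), (50, 14)],      # torso and leg reflections
    [(11, 16), (30, 15), (51, 16)],
]
silhouette = stitch_silhouette(frames)
```

No single frame contains a whole body; only the accumulation across frames does, which is why the article describes the silhouette as something the device builds up as the person moves.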
Their system, however, isn’t eagle-eyed. The images of humans look like heat maps, but in their tests that was enough data for the system to distinguish among five people with 95.7 percent accuracy, and among 15 people with 88.2 percent accuracy.
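The article doesn't describe the classifier the researchers used, but the matching step can be sketched in the same spirit. The nearest-neighbor comparison and the gallery structure below are assumptions, not the team's method.

```python
import numpy as np

def identify(silhouette, gallery):
    """Return the name whose stored reference silhouette is closest.

    gallery: dict mapping person name -> 2D reference silhouette array.
    Plain Euclidean distance stands in for whatever the real system uses.
    """
    flat = silhouette.ravel()
    return min(gallery, key=lambda name: np.linalg.norm(gallery[name].ravel() - flat))

# Hypothetical gallery of five residents, each a 64x32 reference heat map
rng = np.random.default_rng(0)
gallery = {f"person_{i}": rng.random((64, 32)) for i in range(5)}
observed = gallery["person_3"] + 0.05 * rng.random((64, 32))  # noisy new capture
print(identify(observed, gallery))  # -> person_3
```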
They were also able to identify, with 99 percent accuracy, which body part a person standing nearly 10 feet away was moving.
The team is clear that the technology is still a work in progress. Currently the sensor requires a person to walk directly toward it; a person walking at an angle is harder to pick up.
They’ll be presenting their research at a conference next month in Japan. The researchers acknowledge there are related privacy concerns, and say they will consider them as they continue to develop the technology.