
The idea of a device that can materialize one’s memories out of thin air seems like it could only exist in science fiction. But in a new study, researchers were able to pretty accurately sketch out the thoughts of participants simply by scanning their brains. It’s helping scientists understand how memory works in the human brain, and it may be a first step toward the futuristic ability to read minds.

Researchers at the University of Oregon showed a group of participants, each lying in an MRI machine, a series of photos of human faces. They tracked the participants’ brain activity as they looked at each image, mapping neural activity to a code of numbers corresponding to the characteristics of each face. This part of the experiment served as the machine-learning training phase, teaching the computer how patterns of brain activity relate to the numbers assigned to each new face.
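The article does not specify the model the researchers used, but the training phase it describes — learning a mapping from brain activity to a numeric face code — can be illustrated with a simple linear-regression sketch. Everything here (the data sizes, the simulated signals, the least-squares fit) is an assumption for illustration, not the study’s actual method.

```python
import numpy as np

# Hypothetical sketch of the training phase: learn a linear mapping
# from fMRI voxel activity to the numeric code describing each face.
# All sizes and the simulated data below are illustrative assumptions.

rng = np.random.default_rng(0)

n_faces = 100      # training faces shown to a participant
n_voxels = 500     # voxels in the brain region of interest
n_features = 10    # numbers in each face's characteristic code

# Simulated training data: one row of voxel activity per face, and the
# corresponding face code (here generated from the activity itself so
# that a linear relationship exists to be learned).
brain_activity = rng.normal(size=(n_faces, n_voxels))
face_codes = brain_activity @ rng.normal(size=(n_voxels, n_features)) * 0.1

# Fit the mapping by least squares: activity -> face code.
weights, *_ = np.linalg.lstsq(brain_activity, face_codes, rcond=None)

# Given a new brain scan, the learned weights predict the code of the
# face being viewed, which a decoder could then render as a sketch.
new_scan = brain_activity[0]
predicted_code = new_scan @ weights
```

In the real study the second phase inverts this idea: from a scan of someone viewing (or remembering) an unseen face, the learned relationship is used to estimate that face’s code and reconstruct an image from it.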

Combined activation across all participants in the study. The image on the left represents the brain visually processing each image, while the image on the right shows the memory attempting to recreate it. (Courtesy of the Journal of Neuroscience)

Then each participant, still in the MRI machine, was shown a picture of a new face. Using the mapping they had learned, researchers programmed a computer to reconstruct the face from each participant’s neural activity. The result? The computer sketch bore a good resemblance to the new image the participant saw. Researchers gauged the computer’s accuracy by showing the original image and the computer sketch to a new group of participants and asking them to compare the two, answering questions like, “Is the face male or female?” or “Is the face’s skin light or dark?” Participants’ answers were by and large the same for both images: the computer’s sketch reflected the face processed in the earlier participant’s brain.

The OTC — occipitotemporal cortex, which handles visual inputs — set of images represents the machine’s output when the participant was actively observing the picture. The ANG — angular gyrus part of the brain, which processes memory retrieval — set shows the memory-reconstructed version. The five images on the left are the most accurate recreations, while the two on the right are the least accurate. (Courtesy of the Journal of Neuroscience)

Participants were then shown two images of faces at once and told to pick one to picture in their minds. The images were removed, and the computer program scanned their brain activity, attempting to re-create the face held in memory. The resulting sketches were less accurate than in the first experiment, but when the researchers compared the pixel values of each re-creation against the original, the reconstruction matched 54 percent of the time.

“It works better than chance, but just by a little bit,” said Brice Kuhl, one of the researchers. “Our study provides a proof of concept that patterns of brain activity can be translated into visualizations of specific faces, but the accuracy of our method is not high enough that we can have confidence in any particular visualization.”

Researchers Kuhl and Hongmi Lee said the study revealed compelling information about how the brain processes memory.

“We know this brain region ‘lights up’ when people remember something, but there has been a lot of debate about why it lights up,” Kuhl said. “Our major finding was that patterns of activity within this brain region carry information about what people are remembering.”

A deeper understanding of the brain’s relationship to memory could be helpful to studying Alzheimer’s or other memory disorders down the road.

But what about the mind-reading aspect, the idea that this machine drew images from people’s memories? Kuhl and Lee are hesitant to call their experiment true mind reading.

“Certainly, we think it is exciting to see things in science now that used to be only in science fiction,” Kuhl said. “But we will leave it to others to dream up ways in which these methods can be used for other purposes.”