A Carnegie-Mellon University student in his early twenties sits in rapt concentration. Eyes are closed. Blond head is bowed. He leans forward, arms perched on his knees. He squeezes his hands together as he focuses mentally on a series of numbers.
"Zero, five, eight," he says. "Four, seven, eight." Then he reels off a rapid-fire group of six numbers: "Three, two, four, one, five, six," followed by another group of four numbers, "seven, six, seven, zero."
On and on, he flawlessly recites a series of 110 unrelated digits that have just been read to him. This continues without interruption for almost two minutes.
The task may seem extraordinary to those who have trouble remembering ZIP codes and Social Security numbers. But this student is no genius. He is of average intelligence for a student at Carnegie-Mellon and, until he became part of an experimental training program, his ability to remember was unremarkable.
Today, however, after some 800 hours of practice, he can not only recite these long lists correctly more than 90 percent of the time, but he can also perform another amazing feat: he can repeat the list of 110 digits backwards.
"Highly motivated individuals with average abilities can achieve memory performance levels that most people assume require 'special' or 'exceptional' abilities," says James J. Staszewski, professor of psychology at Carnegie-Mellon and one of the investigators who has helped train and study this student and one other.
"The limits of memory can really be pushed back," Staszewski says. "Our ability to remember information is virtually unlimited, given a lot of practice."
Never before has there been such intense scientific interest in memory. In part, this drive to understand how memory works is fueled by the need to help an estimated 2 million elderly Americans who suffer moderate to severe memory loss as a result of senile dementia. Uncounted others have milder memory problems that may begin to develop as early as their thirties.
"Next to new cancer therapies, the quest for memory-improving drugs is the hottest area of medical research," Fortune magazine reported in a recent article on the drug industry.
Most researchers now agree that some memory loss is probably inevitable with aging. But how much -- and when -- is still not understood, largely because, until recently, normal memory has received very little scientific study. "We literally don't know what normal memory is," says psychologist Thomas Crook, formerly a memory researcher with the National Institute of Mental Health and now head of Memory Assessment Clinics Inc. in Bethesda.
The elaborate mental process we call memory serves as a window into the brain, providing a glimpse of how thought processes work. The goal of memory research is not just to increase the capacity of normal memory but also to gain clues to curing such diverse ailments as dyslexia and mental illness.
"If you want to know how people solve problems, how they think, how they use language and so forth, then you have to know something about the structure of memories," says Nobel laureate Herbert Simon, who studies memory at Carnegie-Mellon University in Pittsburgh. "And you have to know something about how they get access to the memories.
"So what you'd like to know is how big is memory? How fast can you get something in it? How fast can you get something out of it? And what routes do you have to use to get something out of it? Memory research is concerned with issues like those."
Answers to those questions are coming from a variety of places:
*At Carnegie-Mellon, Simon and others are using computers to simulate human memory processing and discover the limits of short-term memory.
*At the National Institute of Mental Health in Bethesda, psychologist Mortimer Mishkin and his colleagues are unraveling the pathways of memory processing in monkeys as a step toward understanding them in humans. Their work points to strong interconnections between the brain's limbic system, the seat of emotions, and memory.
*At Columbia University, Dr. Eric Kandel and his coworkers have identified some of the molecular bases of memory and learning -- the chemicals that allow us to store and retrieve memories.
Few people realize how often they rely upon memory.
"Memory is not just your ability to remember a phone number," says experimental psychologist Lynne M. Reder, who studies memory at Carnegie-Mellon University. Rather, it's "a dynamic system that's used in every other facet of cognitive thought processing. We use memory to see. We use it to understand language. We use it to find our way around. There are lots of different kinds of memory."
There is memory for faces. Memory for names, for words, for feelings, for smell, for taste and for sound. Memory plays a role in knowing how to ride a bicycle, swing a squash racket or simply walk around the block.
Memory also orchestrates the vast number of circuits in the brain that allow attention to many tasks at once. Consider the mental time-sharing required to drive a car through town while carrying on a conversation with a passenger, listening to the radio, and still being able to yield the right-of-way to an ambulance.
The ability to draw on this information stored in the brain is part of the learning process that begins at birth -- some contend perhaps even before birth -- and continues throughout life. "Learning is the way animals and human beings acquire new information," says Dr. Eric Kandel, senior investigator at the Howard Hughes Medical Institute, Center for Neurobiology and Behavior, at Columbia University's College of Physicians and Surgeons. "Memory is the way they retain that information over time."
But how sounds, images and ideas are recorded and retained in the brain is still not completely understood. Research into the brain's amazing capacity to record information for a lifetime has revealed that "memory is just a collection of isolated bits," says Carnegie-Mellon's Simon.
Short-term memory -- the briefer form of memory in the brain -- can hold about seven bits of information before either losing them forever or transferring them to long-term memory for permanent storage.
In the brain these information bits are further organized into larger pieces, known as chunks. "A chunk is not any definite number of bits," explains Herbert Simon. "A chunk can contain an enormous number of bits or very few bits. A chunk is any unit that's become familiar to you by previous learning."
The average human brain stores 50,000 to 100,000 information chunks for language alone. A person who is bilingual, or has developed a special capacity for bridge, crossword puzzles or chess, has actually increased the number of information chunks stored in the brain.
"If you're willing to attend to something for eight seconds, it probably gets stored away as a new chunk with some kind of indexing memory attached to it," Simon says. "If you study longer than eight seconds, you probably recover the information more easily later." By studying something for more than eight seconds, he says, the brain may store the information in more than one place.
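The arithmetic behind chunking can be sketched in a toy program. The seven-item span and the four-digit groups below are illustrative assumptions, not data from the Carnegie-Mellon study; the point is simply that grouping raw digits into familiar units leaves far fewer items to hold:

```python
# Toy model of chunking: twelve raw digits exceed the roughly
# seven-item span of short-term memory, but regrouping them into
# familiar patterns leaves only a few chunks to hold.

DIGIT_SPAN = 7  # rough short-term capacity, in items (illustrative)

def chunk(digits, size):
    """Group a digit string into fixed-size chunks."""
    return [digits[i:i + size] for i in range(0, len(digits), size)]

raw = "358493217604"          # 12 separate items: over the span
chunked = chunk(raw, 4)       # ["3584", "9321", "7604"]: 3 items

print(len(raw) > DIGIT_SPAN)        # True: raw digits exceed capacity
print(len(chunked) <= DIGIT_SPAN)   # True: the chunks fit comfortably
```

The trained memorizer, on this view, is not enlarging the span itself but packing more digits into each of the same handful of slots.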
One way to understand how memory chunks work is to look at the differences between chess grandmasters and ordinary, weekend chess players. In a series of studies, researchers placed a chess board with 25 chess pieces in front of these two types of players. The pieces were arranged as they would be during a game. Participants were allowed to look at the board for just five to 10 seconds before it was removed.
"Chess grandmasters will reproduce the board almost perfectly more than 90 percent of the time," Simon says. Yet the weekend chess player "will be lucky if he can put six pieces back." But when the same pieces are placed on the board at random, "the Sunday player will get six right, and so will the grandmaster." The grandmaster's ability in the first part of the experiment has nothing to do with special vision or intelligence, but is merely a function of memory. To the grandmaster, the 25 pieces are not just pieces but six or seven chunks of familiar old friends, positioned in well-known patterns, like the Fianchetto King's Castle position, which describes the placement of several pieces. The grandmaster, Simon says, is "just seeing a pattern. Anybody can recognize seven chunks of information in 10 seconds."
"Then you ask, 'Well gee, how many chunks would a chess master have to have in his head so that in fact any reasonable board put before him would consist of six or seven of these familiar friends?' And the answer comes out on the order of magnitude 50,000 -- the size of a natural language vocabulary. So a chess grandmaster has spent 10 years of his life to acquire 50,000 chunks, just as if you are a crossword puzzle fiend you have spent 10 years of your life to learn crossword puzzles."
These experiments also help to explain why chess grandmasters can do other things, such as play 50 opponents at one time, walking from one table to the next, taking 10 or 20 seconds a move and winning most of their games. "They don't play as well as they do in a tournament," Simon said. "But you ask, 'How do they do that?' Well, they do that by walking up to a board, and if they don't see anything very striking, they make what is called a development move, the standard kind of move that can't hurt you and might help. Any chess player knows about those.
"More often after a little while, they'll see something on the board, because the opponent will have made a move that's not as good as it should be. That's an old friend. And when you see an old friend, you not only recognize him, but you call up all sorts of information about him, and this includes information about what to do about him."
To call up this information -- be it about chess, a familiar phrase or a favorite song -- the mind taps its long-term memory. Exactly how short-term and long-term memory differ is still being investigated, but research is revealing diversity not just in storage capacity, but also in how these two memory systems operate and even in their chemical structure.
For one, short-term memory appears to be more auditory than visual, while long-term memory can be both. "Even when you read, to hold what you read in short-term memory, you translate it, recoding it as it sounds," says Simon. Short-term memory also seems to operate better when "something is spoken versus being written," says psychologist Mary C. Potter of the Massachusetts Institute of Technology.
Columbia University's Kandel and his colleagues have found that at the molecular level, "short-term memory involves modifications of pre-existing proteins" in nerve cells, while long-term memory "involves the synthesis of new proteins" by neurons.
As for capacity, long-term memory, unlike short-term, has "no upper limit," says Simon. "Nobody has ever proved that they've filled up long-term memory."
Where the divisions between short-term memory and long-term memory physically exist in the brain is not yet known. Short-term memory might even be stored in a separate location in the brain, some researchers speculate. But it may also be that these two memory banks are merely divided by chemical differences. One theory is that neurons might store information temporarily and then later "freeze" it into long-term memory.
In addition to long-term and short-term memory, researchers have also described two other forms of memory known as procedural and declarative memory.
Procedural memory is the memory of habits: knowing how to brush your teeth, talk or dial an often-used phone number.
Declarative memory involves the more complex learning of tasks and associations. These are the mental exercises that take conscious participation, for instance, remembering that your roommate from college grew up in Rhode Island, now lives in New York, drives a white Fiat and has one son named Oliver.
Many declarative memories rely upon making associations between numerous facts stored in the brain. Often these are organized by context, so that "if you go back to a place that you haven't been for a really long time, all kinds of memories come back to you," explains Carnegie-Mellon's Reder.
Studies of underwater divers showed this phenomenon even more strikingly. Word tests given to the divers on the beach and underwater showed that their memories functioned much better in the same context. If they learned the word on the beach, they remembered it better there. The same finding held true for words learned underwater.
Emotions also play a part in memory. "Simply stated, material learned when one is happy is better recalled when one is happy, and material learned when one is sad is better recalled when one is sad," reported Stanford University's Dr. Jerome A. Yesavage in a recent issue of the American Journal of Psychiatry.
So how is it that your mind can evoke memories so vividly, allowing the recall of sights and sounds and smells of very important occurrences?
This phenomenon occurs because memory is tied closely to the limbic system -- the brain's seat of emotions -- suggests research from the National Institute of Mental Health. Psychologist Mishkin has found that a part of the limbic system known as the amygdala acts as a kind of "multiselector switch" for deciding whether information should be recorded in the brain.
"The limbic system is an incredible system," says Mishkin. "You can see something once, and it can last not just minutes or even hours, but it can last a lifetime." There are times when memories are so intertwined with emotions that information we never intended to keep gets stored. The most familiar example is John F. Kennedy's assassination.
"Almost everyone knows where they were the moment that they heard he was shot," Mishkin says. "What difference would it make where you were when you heard he was shot? And yet that's what's remembered. That's what is stored."
When the emotional context is right, Mishkin believes, certain neurotransmitter systems are turned on, resulting in a kind of "message print" that records a memory in detail in the brain. In the case of shocking or traumatic experiences, like Kennedy's assassination or, more recently, the explosion of the space shuttle, much more of the circumstances surrounding the information gets recorded.
So when this information is recalled, people remember where they were when they heard the news, who they were with and what was said. All of this appears to take place because of the limbic system.
"We're beginning to think that what the limbic system is all about is bringing something back to mind, something that isn't even there in the environment," Mishkin says.
It allows us, he says, "to go from recognition to recall" -- a feat that is important for evolutionary survival. It is one thing to recognize something -- a parent, a predator or potential food -- and another to be able to recall information. But to be able to associate the two, Mishkin says, is a real biological advantage.
This associative power of the human brain is what gives us the edge over much faster computers. These sophisticated machines can do mathematical calculations in microseconds but stumble on making the vast number of linkages performed by human brains, such as recognizing a face.
Being able to make these numerous associations gives the brain greater flexibility and results in the need for less memory storage space. Were it not for this kind of associative memory, Mishkin says, it would mean on a very simple level that "if one were thinking of bread, one might have to think of butter and would never be able to think of jam or honey."
Every piece of behavior -- including memory and thinking -- ultimately comes down to the action of cells and molecules in the brain. At the molecular level, Columbia University's Eric Kandel and his colleagues have found that memory and learning involve changes in the synapses -- the junctions between nerve cells.
By studying a type of sea snail known as Aplysia, Kandel has found that classical conditioning -- the kind of learning/memory demonstrated by Russian investigator Pavlov and his salivating dogs -- involves the production of a special enzyme in nerve cells. Another form of learning and memory called sensitization, which involves learned reactions to something threatening, is associated with release of another substance called serotonin. "We've identified all the major wheels" in memory, Kandel says, but "there must be wheels within wheels."
Ultimately, researchers believe they will be able to probe memory and thought with various imaging techniques, such as improved techniques for scanning the brain. "We suspect that we will be able to see the little red ball someone is remembering, or the smell of the rose, by imaging or through electrical recording," says Mishkin.
The goal is to understand memory so well that one day it will be possible to repair -- or perhaps even improve -- recall.
"My hope is that as one pushes the analysis to the molecular level, one will be able to put one's hands on important regulatory molecules, which then either can be given in terms of supplements or will give insights into the kinds of things one might do, behaviorally, pharmacologically or dietetically," to improve memory, says Kandel. "I think that we are eons away from that, but obviously that's one thing that we'd like to do."