What is the physical basis of memory? How can a three-pound blob of goop the size of a softball actually "store" your cousin's latest phone number, the savor of grandma's rhubarb pie and the generalized instructions that allow you to drive an unfamiliar rental car -- among billions of other items?

Thanks primarily to rat research and exquisitely sensitive new microtechnologies, scientists are getting tantalizingly close to an answer. But don't throw out your Rolodex just yet. The brain may be the most complicated entity in the cosmos. The average human cranium contains around 100 billion cells called neurons, each of which is connected to 1,000 or more others at junctions called synapses -- for a total of some 100 trillion intercellular gaps. Your mileage may vary, but the number of possible pathways across the network is astronomically vast.

Apparently we got this way thanks to our diets: Animals with a wide spectrum of chow sources range more widely for food and thus need a lot more memory capacity to store food locations than stay-at-home leaf-eaters do. In addition, food-storing species tend to develop commodious spatial memories, with concomitant enlargement of the brain organs that specialize in where-is-it? activities. We share this capability with such distinguished fauna as the crow and the nuthatch. But how, exactly, do we do it?

Nerve activity of any kind involves the transfer of a signal (electrical within each cell, chemical between cells) from one neuron to another. Each cell has a receiving system in the form of dendrites -- a fine web of fibers attached to sending terminals on hundreds of nearby cells -- and an output system, or axon, capable of sending signals to other neurons' receivers across the synaptic gap. The cells obtain their electrical voltages from ions -- charged particles that are common in most body fluids.

The process begins when a neuron's receiving terminals are bombarded by various chemicals released by the sending cell. These messenger molecules, called neurotransmitters (some four dozen kinds are known), flood across the synaptic gap and find specific target locations on the other side, binding to the right type of receptor in a sort of lock-and-key process. Once bound, they trigger various kinds of reactions. Some make the membrane of the receiving (post-synaptic) terminal more receptive to ions, opening special entry ports to allow electrical charges to build up in the receiving cell. The neurotransmitters are then broken down by enzymes or reabsorbed by the sending neuron. (Antidepressants and similar drugs work by controlling the activity of neurotransmitters such as serotonin.) All of this happens in a few thousandths of a second.
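Put in programmer's terms, the lock-and-key step behaves like a table lookup. The sketch below, in Python, is purely illustrative: the receptor table, its three entries and their one-line effects are simplifications, not a real inventory of the four dozen known transmitters.

```python
# Toy lock-and-key matching: a transmitter acts only at a receptor shaped
# to fit it. This table is a simplified illustration, not a real catalog.

RECEPTOR_EFFECTS = {
    "glutamate": "excitatory: opens entry ports so charge builds up",
    "GABA":      "inhibitory: makes the receiving cell harder to fire",
    "serotonin": "modulatory: adjusts how the cell responds to other input",
}

def bind(transmitter):
    """Return the effect if a matching receptor exists; otherwise no effect."""
    return RECEPTOR_EFFECTS.get(
        transmitter,
        "no matching receptor: broken down by enzymes or reabsorbed",
    )

print(bind("glutamate"))   # fits its receptor and excites the cell
print(bind("endorphin"))   # no receptor in this toy table, so no effect
```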

The receiving neuron collects all the incoming electrical signals. If the total net "weight" is below the cell's normal firing threshold, then the neuron simply goes back to what it was doing. If the weight exceeds the threshold, then an electrical potential builds up in the cell's central channel. This charge shoots down the axon until it reaches the end of a branch -- which acts as the sending side of another synapse. There the charge stimulates tiny sacs or vesicles filled with neurotransmitters, which release their contents into the new synaptic gap -- and the whole process starts over again. This sequence proceeds across fields of thousands of neurons at once, resulting in the cascade of signals that is your Social Security number, shoe size or lunch date.
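That threshold logic maps neatly onto a few lines of code. Here is a minimal "integrate-and-fire" sketch in Python; the threshold value and the signal weights are invented for illustration, not drawn from any measurement.

```python
# A toy "integrate-and-fire" neuron: sum the incoming charges and fire
# only if the total clears the threshold. All numbers are illustrative.

FIRING_THRESHOLD = 1.0

def neuron_step(incoming_signals):
    """Collect incoming charges; return True if the neuron fires."""
    total = sum(incoming_signals)
    if total < FIRING_THRESHOLD:
        return False     # below threshold: back to business as usual
    # Above threshold: the spike races down the axon, and each branch
    # dumps neurotransmitter into the next synaptic gap.
    return True

print(neuron_step([0.3, 0.4, 0.2]))        # total 0.9 -> False, no spike
print(neuron_step([0.3, 0.4, 0.2, 0.3]))   # total 1.2 -> True, it fires
```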

If the right kind of sensory stimuli keep coming in to a certain area, whole arrays of brain cells get semipermanently aligned into assemblies that stimulate each other in dependably similar patterns. Scientists call this "long-term potentiation" (LTP) and it's what makes memory possible. It is reinforced as long as Stimulus A (the neighbor's new dog) keeps arriving in conjunction with Stimulus B (a menacing growl).
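Modelers often capture this fire-together-wire-together effect with a Hebbian weight update, in which the link between two stimuli strengthens each time they co-occur. The sketch below is one such toy version; the learning rate, starting weight and number of encounters are all invented.

```python
# A toy Hebbian/LTP update: the synaptic weight linking two stimuli grows
# each time they arrive together. Learning rate and weights are invented.

LEARNING_RATE = 0.2

def hebbian_update(weight, a_active, b_active):
    """Strengthen the connection whenever A and B co-occur (saturating at 1)."""
    if a_active and b_active:
        weight += LEARNING_RATE * (1.0 - weight)
    return weight

w = 0.05   # weak initial link: neighbor's new dog -> menacing growl
for _ in range(8):                     # eight paired encounters
    w = hebbian_update(w, True, True)
print(round(w, 2))                     # ~0.84: the association is now strong
```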

The problem is that your brain stores growls in one place (the auditory cortex) and dog pictures in another (the visual cortex), making it virtually impossible to connect them directly. Instead, you use a portion of the mid-brain known as the limbic system -- in particular an organ called the hippocampus, a pair of seahorse-shaped structures lying approximately between your ears -- as a sort of central switchboard to link different cell assemblies. In fact, says Mortimer Mishkin, chief of the laboratory of neuropsychology at the National Institute of Mental Health, lab studies show "that even when the cell assemblies are in the same tissue, they may not be able to excite each other except through the limbic system."

This may go a long way toward explaining one of the mind's many mysteries: the fact that a concussion often wipes out everything from a few minutes to days prior to impact, suggesting that memories spend a prolonged period in the hippocampus in a sort of mid-processing state between short-term and long-term memory. "We think," says Larry Squire, a memory scientist at the San Diego Veterans Affairs Medical Center, "that the hippocampal system continues to be involved in memory for a long time after the original learning. Head trauma especially disrupts the hippocampal function, causing memory loss that can extend back a considerable time into the past -- but still leaving intact memories from childhood and adolescence."

But another mental mystery remains unsolved: Someone tells you a street address, and in less than a second it has been encoded into a string of altered electrochemical connections among whole arrays of neurons. How could such complex processing of information happen so fast?

Gary Lynch, a behavioral neuroscientist at the University of California at Irvine, looked at other body processes that work in a hurry, such as the clotting of blood, and theorized that an influx of new information (i.e., the torrent of incoming electrical charges) releases calcium ions that in turn trigger an enzyme called calpain. Enzymes are chemicals that serve as catalysts in various reactions and are particularly good at helping break compounds into component parts. And that, Lynch hypothesized, is what calpain does: It instantaneously tears down the cell's protein framework, or cytoskeleton, uncovering buried receptor sites and letting them poke out to form new synapses.

Even if this model proves accurate, however, it still doesn't explain what starts the process that keeps the chains of neurons tuned up to fire in consistent ways. There are two basic alternatives: Either the sending or pre-synaptic cell releases larger quantities of transmitters, or the receiving or post-synaptic cell develops a heightened ability to take in more ions for a given amount of stimulus. For years, betting was heavy on the post-synaptic side, especially following the discovery of a receptor for a kind of glutamate neurotransmitter called NMDA, which apparently plays a major LTP role in associative memory (another name for your classic Pavlovian food-bell-and-drool sort of conditioning). But new research by Richard Tsien of Stanford and Charles Stevens of the Salk Institute has shifted attention to the pre-synaptic side. Both labs, using equipment sensitive enough to monitor the electrical potentials in a single cell, found that ion currents in the cells under examination were more closely correlated with the amount of neurotransmitter released from the sending cell than with the ability of the receiving cell to absorb electrically charged particles.

Simple associative memory, of course, isn't quite the same thing as being able to recall and compare whole passages from "Paradise Lost." But scientists are betting that its basic cellular processes are similar to those involved in more complex recollections.

Daniel Alkon, chief of the laboratory of molecular and cellular neurobiology at the National Institute of Neurological Disorders and Stroke, has studied the way that associative-memory formation alters neural structures in rabbits and a certain homely sea snail whose nerve-cell structure has made it a favorite of memory researchers.

A neuron's "firing" strength is determined by the number and kind of ions that enter the cell. The membrane of an average, quiescent neuron minding its own business is not much inclined to receive impulses. Why? "Ordinarily," Alkon writes, "potassium-ion flow is responsible for keeping the charge on a cell membrane well below the threshhold potential at which propagating signals are triggered. When the flow of potassium ions is reduced, impulses can be triggered more readily."

What might reduce that flow and thus strengthen a neuron's ability to fire impulses and improve memory? Alkon and his colleagues have determined that in certain kinds of brain cells, an enzyme called protein kinase C (PKC) regulates the potassium-ion flow, and hence the cell's excitability. The researchers used classic Pavlovian conditioning, in which animals learn to remember the temporal connection between two events: In the case of the snail, it was a flash of light followed by water turbulence; for the bunnies, an audible tone preceding a puff of air in the eye. In both animals, as they were conditioned to make the connection between these two events, the PKC enzyme in the affected neurons migrated from inside the cell to the membrane and stayed there for long periods -- that is, it made the cells "conditioned." Understanding these and related molecular processes is essential to devising new kinds of drugs to improve memory and diminish the effects of conditions such as Alzheimer's.
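As a rough sketch of the direction of that effect, the toy model below treats conditioning as a gradual reduction in potassium-ion flow that leaves the cell easier to fire. Every quantity in it is invented; only the direction of the mechanism comes from Alkon's account.

```python
# Illustrative only: paired trials (light + turbulence, or tone + air puff)
# trim potassium-ion flow, raising excitability. Quantities are invented.

def condition(trials, k_flow, reduction=0.08):
    """Each paired trial reduces the potassium flow a little."""
    for _ in range(trials):
        k_flow *= 1.0 - reduction
    return k_flow

def fires(stimulus, k_flow, base_threshold=1.0):
    # Less potassium flow -> lower effective threshold -> easier to fire.
    return stimulus > base_threshold * k_flow

k = 1.0                                  # naive cell: full potassium flow
print(fires(0.8, k))                     # False: stimulus can't trigger it
print(fires(0.8, condition(10, k)))      # True: after training, it fires
```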

Meanwhile, other cell-level research projects are examining the flip side of memory: why it eventually fails. "There used to be this myth that we lost 100,000 neurons a day," says Squire. "But those estimates were based on brains that were probably not healthy." It now appears that, in the hippocampus at least, "we only lose about 100 a day, or perhaps 3 percent a decade."

But memory over time is still a no-win situation. It now seems that our very ability to build complex connections, known as "plasticity," may cause the cells to wear out and break down. Born to lose: "When we mammals bought this thing, say 200 million years ago, we didn't read the fine print," says Lynch. "Evolution said, 'I'll give you this great device that lets you change each synapse selectively. Oh, but by the way: As the brain ages and the calcium-buffering machinery begins to deteriorate, look out.' "

"The brain's utility, of course, is immense. If I were to tell you that you're going to be expected to store more information in the course of week than a supercomputer, you'd be amazed. But there's a horrible irony: The same machinery that you're using to change the contact -- if allowed to go too far -- will cause pathology."

Which is why Memory Assessment Clinics and other labs are now testing a number of drugs that may diminish recollective decay. Some affect specific neurotransmitters such as acetylcholine, serotonin and dopamine; some tone up neural membranes, which can lose their fluidity and receptivity with age. Whether or not such agents ever end up on pharmacy shelves, their study is revolutionizing the nature of brain science. In the past, doctors could only observe symptoms in their patients and then ask what was going on in the brain. "But now, for the first time in the history of memory research," Lynch says, "we can go from the biological mechanism back to behavior."