You're losing it. One day you start to make a phone call, but the doorbell rings. Three minutes later you're back, staring down at the receiver and wondering who you were going to call -- and why. You shrug it off. But then at the big meeting, you're introducing the West Coast project director. Suddenly your jaw plops agape. There is a horrible pulsing black void where her name used to be. Brain cancer? Early Alzheimer's? Alien possession?

No -- just life as we're getting to know it. Memory declines naturally and inevitably with age, starting around 30, with some functions decaying as much as 50 percent by the time we hit the geezer threshold. The good news is that most normal failures are limited to three kinds of situations: when you're overloaded with incoming inputs (say, 14 new faces and names), when there's limited time available for recall, or when multiple tasks have to be performed amid a welter of distractions. The bad news is that, taken together, those three conditions are a fairly accurate description of contemporary existence.

As a result, mnemonic anxiety has become epidemic among Baby Boomers, especially info-whelmed overachievers here in Memo City. "You wouldn't believe the number of calls we get from people over 40 who are very, very worried," says psychologist Thomas Crook, president of Memory Assessment Clinics, a worldwide testing network headquartered in Bethesda. "They're the same highly motivated people who are also working out in gyms. They not only want to prevent losses -- they want an edge on the competition."

That doesn't surprise psychiatrist Joseph Mendels, director of Philadelphia's Memory Institute: "First, more people are living longer -- so naturally, we're seeing more of the memory problems associated with aging. Second, the nature of society is changing. Once, a substantial percentage of American workers was employed in blue-collar jobs where details of memory were probably less important. Today, with the growth of service industries, memory plays a more central role. And finally, the news media have made people much more aware of the prevalence of Alzheimer's disease, which may affect as many as 4 million Americans. The public is consequently more sensitive." Add to this a culture that puts a rising premium on fast fact-retrieval, and you've got the makings of pop hysteria.

But you must remember this: A glitch is just a glitch. If you're like most data-frazzled urbanites, you're overreacting. "The correlation between people's complaints and how they actually perform on memory tests is pretty poor," says Crook. "About 72 percent think they're worse than they are." And worrying will only aggravate the problem by invoking one of the three conditions guaranteed to ruin your memory: anxiety, depression or plain old failure to concentrate. Unfortunately, in our beeper-shriek, car-phone, E-mail, FAX-back world, paying full attention to anything gets harder every year. No wonder that "most people's complaints about memory," says Mendels, "relate more to initial attention problems rather than actual memory failures."

In fact, the explosion of neurological knowledge in the past few years has revolutionized our understanding of the miraculous molecular palimpsest we call memory. And dispelled a few myths along the way: It now appears that even under optimal circumstances we don't remember some things very well. A number of researchers argue that our brains were never designed to be factually accurate in the first place. And there is encouraging new evidence that some mental functions are impervious to age damage, while others may actually improve with the years.

Memory as Fiction

Scientists themselves disagree on what memory is and even on how to describe it. Sometimes, it is defined by its duration: "Primary" memory, the portion of active consciousness that lets you repeat a sentence you just heard, rarely declines with age; "secondary" memory, which lasts from a few seconds to a few days and includes things like who borrowed your pen, erodes as we get older. Alternatively, memory can be divided by function: Implicit, involving learned skills such as swinging a tennis racquet or speaking French that become "automatic"; semantic, comprising objective facts like the date of the Battle of Hastings, general knowledge and information independent of context; and episodic, concerning specific events defined by time, place and personal history. Only the last of these three degenerates dramatically over time.

And much of that may not be accurate anyway. The more we begin to understand how the neural networks in our brains work, the more it appears that the very same processing architecture that permits us to recognize Uncle Ned (even though we've seen only a tenth of his face in a crowd) may make our memories suspect. As Jeremy Campbell explains in "The Improbable Machine": Since it is in the nature of our brain-cell networks to "fill in the missing parts of information that is incomplete, the version of the world that is tossed up to consciousness may be largely fictitious."

Of course, it has traditionally been assumed that certain kinds of recall are extremely and indelibly accurate -- especially "flashbulb" memories, those imprinted with a great deal of emotional force, such as what you were doing when you first heard that JFK had been shot.

When the shuttle Challenger exploded in 1986, Emory University cognitive psychologist Ulric Neisser saw a unique opportunity to test that theory. The morning after the disaster, he asked 100 students to write a detailed account of when, where and how they got the news, along with their feelings at the time. Then in '88 and '89, he located several dozen of those same students and again asked them to recall the events.

Nearly half were "dead wrong," and only a handful were near accurate. Worse yet, few students -- whose brains were presumably at their mnemonic peak -- even remembered filling out the original questionnaire. "There were a lot of retrospective constructions," Neisser says. "Many people thought they'd first seen the news on television. Whereas we know in fact that they heard it from the cleaning lady. At some point, they did see it on television, and that visual image stuck in their minds. Everything else they made up."

And, says Neisser -- one of the world's leading authorities on cognitive memory -- that may be the way we're designed to function. The need for perfect recall of every prior event is overrated. "What's the evolutionary advantage to that kind of precision? Those stimuli are never going to recur in exactly the same way. We only have to be approximately right." Maybe episodic memories aren't even supposed to be accurate. Instead, "they serve a social purpose -- we talk about them, mothers with children, friends with friends." They help people understand each other's feelings and preferences, and "for children between 3 and 5, they teach how events and relationships are extended in time. More and more, I'm inclined to think that's what the point of remembering the past is."

Nor will perfect recall make us smarter. "Intelligence is based on semantic memory, not on episodic memory," Neisser says, and the latter may just get in the way. "Suppose the teacher is saying, 'If Johnny has three apples and Mary has two apples . . .' It won't do you any good to jump up and say 'Hey, I remember very well that Johnny had only one apple!' "

Evolution

Why would evolution ever have favored a system this sloppy? How come natural selection didn't just stomp out whichever primeval coot started forgetting his way back to the cave?

For one thing, our brains were never intended to last as long as they have. "If you take animals in the wild," says Gary Lynch, a behavioral neuroscientist at the University of California at Irvine, "they don't live even 25 percent of their potential maximum lifespan. So there's no evolutionary pressure for or against age-related deterioration."

For another, it may keep us alive longer. "Just imagine a young animal that has to survive," says Crook of the Memory Assessment Clinics. "At first, it really is necessary to recognize each specific detail, the structure of the environment, locations of food and enemies. But after a while, it is no longer adaptive to remember every single fact. Better to form generalities. This leads us to wonder: How much of the memory loss that occurs with age is not simply adaptive?"

Neuroscientist Gerald Edelman of the Rockefeller University believes that a kind of "neural Darwinism," or natural selection, is at work within individual neuronal groups: As an organism ages, certain groups of neurons are "selected" for reinforcement based on their fitness for adaptive behavior. A key criterion of fitness is the ability "to recognize large numbers of different examples of a category after initial confrontation with just a few of them." The most "fit" will be employed repeatedly in future behavior.

Examining the behavior of various creatures, from pigeons, who can recognize shapes they've never seen before as "trees" or "non-trees," to four-day-old infants, who can discern categories of moving objects even when parts of the objects are not visible, Edelman argues that "memory is the enhanced ability to categorize or generalize associatively, not the storage of features or attributes of objects as a list."

This ability not only permits an individual to deal efficiently with huge volumes of new information but "relieves the organism of the burden of storing large numbers of single instances." This evolutionary trade-off gives us remarkable abilities: The lowly pigeon can recognize characters from the "Peanuts" cartoon series when various parts of the figures are scrambled, turned upside-down, etc. But one price we pay, Edelman theorizes, is memory's lack of "completeness and error-free operation."

Natural Aging

That's not much consolation if you've just lost your biggest account because you forgot the CEO's wife's name. But at least you're in good company. Recalling names is the nation's No. 1 complaint, followed by remembering written material right away and the all-time sitcom favorite, misplacing keys and eyeglasses.

You might as well get used to it: About 50 percent of folks over 50 will eventually suffer from age-associated memory impairment (AAMI), as Memory Assessment Clinics terms it. In general, this means that, in addition to names, you'll tend to forget what you were going to get at the grocery store, the news in this morning's paper and where you left your umbrella, dog or grandchild. You'll also have trouble remembering the increasingly complex data chains of modern life. MAC research indicates that young and old can handle seven-digit phone numbers about the same. But at 10 digits (soon to be required for local calls in the Washington area), older subjects do worse.

If that prospect is disturbing, take heart: There are plenty of proven methods to improve your memory. All you need is an active imagination -- and maybe a couple of Hershey bars.

One of the most successful techniques is among the most ancient: associating an image or extraneous idea with the thing to be remembered. Classical Greek and Roman orators used to remember long, complex speech outlines by imagining a familiar route and mentally "placing" each topic at sequential intervals. Then they would "walk" the route during the speech, encountering the right items in the right order. Modern memory-enhancement regimens use similar methods. You can be trained to improve face/name recall by superimposing a mnemonic image over the face (or other item) to be remembered. Thus Dorothy Grunwald might be a door in a green wall, Mike Lyons a lion holding a microphone, and so forth.
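For readers who think in code, here is a minimal sketch of the orators' "memory walk" idea, assuming nothing beyond what the paragraph above describes; the route stops and speech topics are invented purely for illustration:

    # Illustrative sketch of the classical "memory walk" (method of loci).
    # The landmarks and speech topics below are invented examples.

    route = ["front door", "mailbox", "oak tree", "bus stop", "corner bakery"]
    topics = ["greet the audience", "last quarter's results",
              "the West Coast project", "budget questions", "closing thanks"]

    # "Place" each topic at the next landmark along the familiar route.
    loci = dict(zip(route, topics))

    # Delivering the speech: mentally walk the route and retrieve each topic.
    for landmark in route:
        print(f"At the {landmark}: {loci[landmark]}")

The trick is simply the pairing: each topic inherits the natural order of the route, so recalling the walk recalls the outline.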

Studies of people with naturally occurring super memory abilities suggest that this visual-cue aspect is critical. Perhaps the most celebrated mnemonist of the century, a Russian newspaperman called "S" by his psychologist, Alexander Luria, could remember long strings of numbers for up to 30 years by simply "seeing" them in his mind and using the "memory walk" technique, in which he mentally placed objects to recall along a familiar Moscow street. "Sometimes I put a word in a dark place and have trouble seeing it as I go by," he complained to Luria.

It may also help to recreate as closely as possible the environment in which you first learned the things you're trying to recall. That, of course, is what happened to French novelist Marcel Proust when as an adult he dissolved his fabled madeleine in a spoonful of tea and inhaled the scent. The ensuing snout-jolt brought back seven volumes' worth of scenes from his childhood. (But then his nose gave him a leg up. Odors are powerful provokers of emotion-laden memories, in large part because our olfactory processing equipment is wired directly into the mid-brain structures involved in both memory and emotion. By contrast, visual images go through an enormous amount of signal-processing back in the occipital lobe before being forwarded to the mid-brain for routing.)

In a recent neo-Proustian exercise, psychologist Frank Schab conducted an experiment at Yale to see whether odors could serve as a memory "retrieval cue." His hypothesis was that the environmental context in which we learn something is "encoded" along with the memory itself; thus evoking the context should make it easier to bring back the memory.

Several groups of students were asked to memorize a list of words. Some did so in a room pervaded with the odor of chocolate; others in normal air.

The groups were tested 24 hours later; again, some in a chocolate-scented atmosphere, some in plain. There was a dramatic difference in recall: Students exposed to the chocolate odor in both learning and testing scored 25 percent higher than those who got the scent during learning but not in testing -- and 50 percent better than those who had no odor in either session.

But then, you may not have to train your brain at all: New research suggests that the most widely feared aspects of aging -- forgetting to see the doctor, take medication or turn off the oven -- are less likely than they seem. Behavioral scientists have generally assumed that as people get older there is a dramatic decline in "self-initiated" recall (tasks in which one has to remember to remember) and that the decay is especially visible in "prospective" memory -- that is, the ability to remember to do something at a future time.

A recent study compared the performance of a group of 17-to-24-year-olds with a group ages 65 to 75. As expected, the oldsters scored far worse on "retrospective" memory tests -- the capacity to recall, for example, who guest-hosted the Carson show on Monday night. But there was no difference in prospective memory, especially when subjects were allowed to write themselves notes or devise other kinds of reminder aids.

Moreover, a new test called the Reflective Judgment Interview may provide a better measure of the way we actually use our memories in the second half of life -- when it becomes rather unlikely that we'll abruptly be ambushed and forced to name the capital of Nebraska. Instead of measuring rote memory, it asks subjects to analyze complex, so-called "ill-structured" problems -- questions such as "Does TV make you stupid?" that have to be pondered from various perspectives because they don't have "true" or "false" answers. On this test, those under 25 actually do worse than older people, who seem better equipped to marshal broad patterns of memories even if they can't recall specific information.

So quit worrying and be grateful for what you've got left. Sure, you've lost your wallet three times already this week and those upstart young people in the office seem to have supercooled Crays where their brains ought to be.

"But," Crook asks, "would you really want one of these kids running your company? Or the country?"

Curt Suplee writes on science and technology for The Washington Post.