Of all the scientific enterprises that we taxpayers support, none is more important--from both a practical and a purely intellectual viewpoint--than neuroscience, the study of the brain. It is the brain that gives rise to perception, memory, emotion, language and all the other mysterious phenomena that constitute our minds--and have perplexed us for generations. The brain is also, in a sense, the source of our most pressing social problems: war, racism, poverty, pollution, crime. The American Psychiatric Association has estimated that almost 40 million adult Americans suffer from mental disorders, including substance abuse. The estimated costs of these ailments to society exceed $400 billion a year, more than the costs of cancer, heart disease and AIDS combined.

Perhaps it is not surprising, then, that nine years ago Congress designated the 1990s "The Decade of the Brain." What may be surprising is that not all neuroscientists were pleased by this designation. The Decade of the Brain was a "foolish idea," grumbled Torsten Wiesel, who won a Nobel Prize in 1981 for his work on the neural underpinnings of vision. "We need at least a century, maybe even a millennium, to understand the brain," Wiesel told me.

Wiesel was alluding to a well-kept secret of his field. Although neuroscientists are accumulating data about the human brain at a prodigious pace, the human mind remains, in many respects, as mysterious as ever. Researchers proclaim that they are on the brink of uncovering the keys to aggression, depression, addiction, schizophrenia, even consciousness itself. The problem is, they never quite get there. The haunting question, given the record of neuroscience thus far, is whether they ever will.

Superficially, the picture for neuroscience has never looked brighter. Money for research certainly is not a problem: The annual budget of the National Institute of Mental Health, the major federal funder of neuroscience research, has more than doubled over the past decade, to $859 million, and membership in the Society for Neuroscience has swelled from 17,524 in 1990 to more than 28,000 this year. Those who attend the society's annual meeting later this month in Miami will be able to choose among more than 13,000 presentations and abstracts on brain-related phenomena, from mental illness to sleep and consciousness. General-interest science magazines are crammed with reports about neuroscience, and journals devoted to the topic are proliferating.

Technology is driving this explosion. Researchers have acquired an ever-more-potent array of tools for probing neural phenomena. They can watch the entire brain in action with positron-emission tomography (PET) and magnetic-resonance imaging (MRI). With microelectrodes, they can monitor the minute electrical impulses passing between individual nerve cells. They can trace the effects of specific genes and neurotransmitters on the brain's functioning. And they can model these events with powerful computers.

Neuroscientists are clearly getting somewhere. The question is, where? I once asked Gerald Fischbach, director of the National Institute of Neurological Disorders and Stroke and a former president of the Society for Neuroscience, to name the most important accomplishment of his field. He smiled at the question. The field's most striking characteristic, he said, is its constant and prolific production of findings. Researchers keep discovering new neurotransmitters; neural receptors, the lumps of protein on the surface of neurons into which neurotransmitters fit; and neurotrophic factors, chemicals that guide the growth of the brain from the embryonic stage into adulthood. Studies of the brain have revealed "a diversity no one expected," Fischbach said.

Unfortunately, no one has any idea how the brain integrates the output of all its disparate components to create what we think of as a mind, or self. Neuroscientists sometimes call this conundrum "the binding problem." I would propose another term: the Humpty Dumpty Dilemma. Neuroscientists have done a great job of breaking the brain into pieces, but they have no idea how to put it back together again.

Indeed, what Fischbach was inadvertently highlighting is one of his field's most paradoxical and disconcerting features. As researchers learn more about the brain, it becomes increasingly difficult to imagine how all the disparate data can be organized into a cohesive, coherent whole. Neuroscientists have yet to achieve a unifying--or "reductionist"--epiphany for their field. Their progress is, instead, a kind of anti-progress.

Although the term "reductionist" is often used disparagingly, good science is reductionist by definition. At its best, science isolates a common element underlying many seemingly disparate events. Charles Darwin, for instance, showed that all the diverse species on Earth were created through a single process--evolution. And, in the last half-century, James Watson, Francis Crick and other molecular biologists revealed that all organisms share essentially the same DNA-based method of transmitting genetic information to their offspring.

But neuroscience has failed either to confirm or to rule out any of the competing unified theories of human nature. Ask 100 different scientists how the mind works, and you will get 100 different answers, ranging from behaviorism and Darwinian theory to computer science and quantum mechanics. Incredibly, some leading neuroscientists still favor psychoanalysis, the baroque theory and therapy invented by Sigmund Freud more than a century ago.

There has also been a troubling schism between neuroscience and what one might expect to be its chief beneficiary, psychiatry. Neuroscientists have found no reliable neurological markers that would illuminate and simplify the diagnosis of diseases such as schizophrenia, bipolar disorder and depression. Most of the medications used to treat mental illness, such as lithium, were discovered through serendipity, and those drugs have limited effectiveness.

Perhaps the most profound insight of neuroscience is that the brain is not a homogeneous, all-purpose computer but a collection of "modules" dedicated to different tasks. Scientists have learned a great deal about which parts of the brain underlie which functions by studying brain-damaged patients. Depending on where the damage occurs, patients may be unable to recall the names of people, of animals or of inanimate objects; others can no longer decode irregular verb endings.

Brain damage can also result in dramatic additions to, rather than subtractions from, a person's psyche. Physicians have reported more than 30 cases of a condition known as gourmand syndrome, in which damage to the right frontal lobe results in an obsession with fine food. A Swiss political journalist made the most of his condition; after recovering from his stroke, he started writing a food column.

Modern brain-scanning technologies such as MRI have accelerated the fragmentation of the brain and mind. Researchers claim to have pinpointed the seat of musical ability, mathematical talent or obsessive-compulsive behavior. This trend is reminiscent of phrenology, the 19th-century pseudoscience that linked bumps on the skull to personality traits such as hotheadedness or dishonesty.

Optimists hope that neuroscience will be delivered from its current impasse by a genius who discerns patterns and solutions that have eluded his or her predecessors. The history of other scientific fields provides some justification for this hope. During the 1950s, particle physics was mired in a crisis that in some ways resembled the plight of neuroscience today. Accelerators seemed to generate an exotic new particle almost daily; theorists had no idea how to organize the welter of findings into a cohesive theory. Then, a brilliant young theorist named Murray Gell-Mann showed that many of these strange particles were made of a few more fundamental particles called quarks. Out of chaos, order.

But in terms of sheer complexity, particle physics is a child's game, a 10-piece jigsaw puzzle of Snow White compared with neuroscience. Protons, electrons and other particles of a given type are identical; a theory that applies to one proton applies to all. But each brain is unique, and an individual brain can change dramatically when its owner is spanked, learns the alphabet, reads "Thus Spake Zarathustra," takes LSD, falls in love, gets divorced, undergoes Jungian dream therapy or suffers a stroke. When it comes to the human brain, there may be no unifying insight that transforms chaos into order.

I am not alone in arriving at this pessimistic conclusion. Howard Gardner, the Harvard psychologist and educator, has declared that "the phenomena of sensation, perception or other psychological states will never be reducible" to a strictly neural theory. The neuroscientist Gunther Stent of the University of California at Berkeley put it even more succinctly when he asserted that "the brain might not be capable, in the last analysis, of providing an explanation of itself."

Last year, after I presented some of these arguments to a group of neuroscientists at the California Institute of Technology, a researcher angrily asked me what my point was. Did I think he and his colleagues should give up? Should Congress take away their funding? I answered no to both questions. And I stand firmly by that response. Neuroscience's potential is so vast that it cannot be abandoned now, or ever. As long as we remain mysteries to ourselves, as long as we suffer, as long as we have not descended into a utopian torpor, we will continue to ponder and probe ourselves with the instruments of science. How can we not? Inner space may be science's final--and eternal--frontier.

John Horgan is the author of "The End of Science." This article is adapted from his new book, "The Undiscovered Mind: How the Human Brain Defies Replication, Medication and Explanation" (Free Press).