In What Counts: How Every Brain Is Hardwired for Math (Free Press, $26), cognitive psychologist Brian Butterworth argues that we are born with brain circuits specialized for answering the question "How many?" While all of us possess this Number Module, as he calls it, Butterworth deftly slips in the unresolved question of whether the experimentally confirmed number-crunching chimpanzees, ravens and at least one parrot possess a "predecessor" of our Number Module. Nor do we know whether these savant-like animals use the same brain areas to carry out their numerical tasks.
We do know that adults with brain damage can lose the ability to perform numerical operations that would pose little challenge to the average grade-schooler. As examples Butterworth introduces us to a cast of fascinating patients including Signora Gaddi, who despite otherwise normal cognition cannot count above the number four; Mr. Bell, whose understanding of speech and written language is almost nonexistent but who nevertheless retains a serviceable ability in arithmetic; and Mr. Morris, who after hearing a series of numbers cannot repeat back more than two of them yet can carry out accurate mental calculations involving two three-digit numbers. On the basis of these examples Butterworth concludes that "arithmetical facts and arithmetical procedures occupy different circuits in the brain." Even more intriguing, "writing words and writing numerals, reading words and reading numbers all involve distinct brain circuits, despite having common input pathways from the eyes and common output pathways to the hands."
Given this emphasis on the brain as an explanation for mathematical abilities, Butterworth's conclusion that "anybody can be a math prodigy" comes as a surprise. To support this contention, he refers to a famous study by the French psychologist Alfred Binet showing that experienced cashiers at the Bon Marché department store in Paris could calculate more rapidly than two math prodigies competing against them. Butterworth favors the explanation that "in those days a cashier was recognized as highly skilled" rather than the more reasonable one that "self-selection played a part: those who couldn't do [math] or didn't enjoy doing it moved on to other positions within the store." At another point, using the English shorthand for "mathematics," he concedes what we mathophobics have always known: "Having good natural abilities for maths may be exactly the reason for choosing maths in the first place."
In The Undiscovered Mind: How the Human Brain Defies Replication, Medication, and Explanation (Free Press, $25), John Horgan, former senior writer at Scientific American, aims to shift our attention "from science's accomplishments to its limitations." Horgan is unhappy because the brain isn't fully understood and because cures for mental illness aren't currently available (the "explanatory gap"). Nor does he see much chance for improvement in the near future, since neuroscientists "keep uncovering more and more complexity" and "researchers keep slicing the brain into smaller and smaller pieces with no end to the process in sight." As a result, he's concluded, "neuroscience's progress is really a kind of anti-progress." In order to help rectify this sad state of affairs, Horgan decides to confront some of the more illustrious names in neuroscience about this "explanatory gap." His descriptions of the neuroscientists and other interviewees are universally unflattering and often tell us more about Horgan than about his subjects.
We learn that Yale researcher Patricia Goldman-Rakic "chortled" and at one point in the discussion "leaned forward toward me and gripped my forearm"; that Freud critic Frederick Crews dresses "like an executioner"; that Joseph LeDoux, author of The Emotional Brain, is a "cool, controlled man with deep-set eyes"; that Eric Kandel, who is often mentioned for a Nobel Prize, "has dominated neuroscience for decades through a combination of brilliance and bullying"; and that Harold Sackeim, who has "helped to rehabilitate the reputation of shock therapy," seemed during the interview to be "constantly gauging my reaction to his words."
In all instances Horgan finds the explanations of the researchers unconvincing because "when it came to the human mind I felt I was missing something." At the end of her meeting with the author, Goldman-Rakic reaches the same conclusion as this reviewer: "I think it's in your head that there's an explanatory gap."
Horgan is less sophomoric when he shifts his attention from the brain to related issues like evolutionary psychology, which he describes as "a strangely inconsequential exercise, especially given the evangelical fervor with which it is touted by its adherents." Since for some reason artificial intelligence doesn't raise his hackles, his discussion on that topic is lucid and informative -- indeed the best I've seen in a book aimed at a general audience. Interesting too is his conclusion that "when it comes to the human brain, there may be no unifying insight that transforms chaos into order."
In the early chapters of The Missing Moment: How the Unconscious Shapes Modern Science (Houghton Mifflin, $25), Robert Pollack, a professor of biological sciences at Columbia University in New York, provides a marvelous overview of memory, sensation and consciousness. He also provides lively and metaphor-rich descriptions of smell, vision, hearing and touch. Would you have guessed that "all the distinctions we make to see plaids, Kandinskys, traffic lights, and flowers are stimulated by wavelengths of light that are so close to one another, it is as if symphonies of music were being performed and perceived in the range of tones falling between a B and its nearest B-flat"? Or that our brain's " `wiring' keeps changing in response to the lives we lead" and "each of us is as different in terms of brain chemistry from all other people as our lives are from all other lives"? Or that the conscious present that we are experiencing right now actually occurred during a missing moment about a half-second ago? Pollack explains all this in wonderfully entertaining and lucid prose.
As a result you want to forgive him for the stale Freudianisms he trots out at odd moments in an otherwise excellent book. The Missing Moment would also be improved absent Pollack's gratuitous and usually irrelevant efforts at political correctness. For instance, the early attempts at vaccine development range, in his view, from "acts of criminal responsibility" to "uncontrolled unethical experiments." And Mary Lasker's pioneering work in establishing the American Cancer Society is diminished, in Pollack's eyes, by circumstances not under her direct control, i.e., "her family's success at creating effective advertisements for -- cigarettes." When Pollack lightens up on the sociopolitical stuff and sticks to what he knows best -- biology -- the results are engaging and entertaining.
In Origins of Genius: Darwinian Perspectives on Creativity (Oxford, $27.50), Dean Keith Simonton, a professor of psychology at the University of California at Davis, provides an encompassing overview of the factors that contribute to genius. As Simonton points out, "graduating summa cum laude doesn't necessarily predict the later receipt of a Nobel Prize." Certainly many geniuses achieve more as adults than their childhood teachers predicted. For instance, in a secondary school composition class D.H. Lawrence ranked 13th out of the 21 taking the course. Nor do all geniuses make the I.Q. cut. When Lewis Terman first employed an I.Q. test to select a sample of child geniuses, he excluded a child whose I.Q. didn't make the grade. Decades later that child, William Shockley, the co-inventor of the transistor, went on to win the Nobel Prize, while not one of the 1,500 children Terman selected ever did.
Moreover, many of our most creative geniuses express a barely concealed antipathy not only toward I.Q. determinations and testing in general but also toward their entire formal educational experience. Einstein complained that "it is in fact nothing short of a miracle that the modern methods of instruction have not yet entirely strangled the holy curiosity of inquiry."
On the question of whether genius is inherited or developed, Simonton suggests a convenient and sensible response to this "nature versus nurture" controversy. He cites a "distinctive form of inheritance" consisting of multiple genetic components, all of which must be inherited in order to result in a genius. As an example he suggests the math genius Carl Friedrich Gauss, whose father was a bricklayer and whose mother was a peasant. "Apparently to become a Gauss requires a distinctive convergence of many abilities, interests and values. If a single attribute is missing, you no longer have a Gauss."
In addition, genius may depend on the multiplication rather than the addition of the various components. This would help explain why in any given creative field 10 percent or fewer of the creators are responsible for 50 percent of all the contributions. For instance, in classical music the works that make up the standard repertoire can be attributed to no more than 250 composers. Of these, 16 account for half of all the pieces regularly performed. Moreover, those composers ranked among the last 150 on the list can be credited with only one work each. "All of their compositions together represent 6.0% of the repertoire, which is less than Mozart's figure of 6.1%, and only slightly higher than the 5.9% each contributed by Bach and Beethoven."
Despite his contention that Darwinian theories best account for musical and other creative geniuses, Simonton admits the possibility of other explanations and sensibly concludes that "at present we lack a distinct theoretical system that will accommodate all creative activities in a coherent fashion."
Richard Restak is a neurologist and neuropsychiatrist who has written a dozen books on the brain and behavior.