Reviewed by Richard Restak
Sunday, December 17, 2006
THE EMOTION MACHINE
Commonsense Thinking, Artificial Intelligence, and the Future of the Human Mind
By Marvin Minsky
Simon & Schuster. 387 pp. $26
Writers about the human mind generally fall into three camps: philosophers, psychologists and others who weave elaborate theories about the mind without any reference to the brain; neuroscientists who attempt to link mind matters with brain states; and, finally, members of the computer science and artificial intelligence (AI) communities who suggest that it's possible to replicate human thinking in a machine. Marvin Minsky, professor of electrical engineering and computer science at the Massachusetts Institute of Technology and an early pioneer in developing artificial intelligence, is an eminent denizen of the third camp.
In The Emotion Machine, Minsky aims to find "more complex ways to depict mental events that seem simple at first." He brilliantly achieves this goal when he suggests that consciousness remains unexplained because it is "one of those suitcase-like words that we use for many types of processes, and for different kinds of purposes." Since consciousness is not a unity but involves separate mental components, "there is little to gain from wondering what consciousness 'is' -- because that word includes too much for us to deal with all at once."
Minsky does a marvelous job parsing other complicated mental activities into simpler elements. He discusses such topics as common sense, thinking, the self and -- most important for this book -- emotional states, which are "not especially different from the processes that we call 'thinking.' "
But he is less effective in relating these emotional functions to what's going on in the brain. Minsky says his book "does not discuss most current beliefs about how our brains work" because our knowledge about the brain soon becomes outdated. But then how can one draw meaningful correlations between brains and machines?
Equally unsettling, several of his points about the brain are not in line with current knowledge. For instance, it's not true, as Minsky claims, that "after certain major stages of growth in the brain, many new cells are later destroyed by 'post-editing' processes that evolved to delete some types of connections." Actually, the loss of cells results from passive disuse -- use it or lose it -- rather than active deletion.
Some of his other statements may be correct, but I wonder how one would go about proving them: "I suspect that large parts of our brains work mainly to correct mistakes that other parts make -- and this is surely one reason why the subject of human psychology has become so hard." This quirky and provocative assertion is based on the fact that "many computer systems eventually become so ponderous that their further development stops, because their programmers can no longer keep track of what all the previous programmers did."
This example, along with others throughout the book, assumes that computers and brains operate on similar principles. But testing that assumption, according to Minsky, isn't likely to be successful any time soon: "We learn more such details about the brain every week -- but we still do not yet know enough to simulate even a spider or snake." Given the limited state of our current knowledge, is it unreasonable to question the appropriateness of a machine model for human emotion?
Minsky proposed many of his ideas linking neuroscience with AI in his 1986 book, The Society of Mind. But in The Emotion Machine, he does not always account for more recent advances in our understanding of neurons (nerve cells). Of the 1.1 trillion cells in the human brain, only 100 billion are neurons, leaving an enormous number of cells that, neuroscientists are convinced, must be important in information transfer. Moreover, anatomical interaction of neurons highlights only one aspect of brain functioning. Equally important are alterations of the brain's chemical messengers, the neurotransmitters, along with changes in local and distributed electrical fields. A successful AI model of the mind must consider these features, as well.
Finally, applying to the brain such vague, ill-defined terms as "resources" doesn't adequately capture the brain's dynamism. Minsky admits as much, saying he can't identify these "resources" because "research on this is advancing so quickly that any conclusion one might make today could be outdated in just a few weeks."
In the final analysis, technical advances may offer our best hope when it comes to explaining how our minds work. Many states of mind -- fear, joy, desire -- can now be visualized through brain imaging techniques. This would be closer to an "explanation" for the mind, it seems to me, than anything offered by Minsky's employment of such obscure terms as "imprimers," "trans-frames," "K-lines," "credit assignments" and "micronemes," which have no agreed-on scientific meaning and seem, as Minsky concedes, "hopelessly vague."
Despite these reservations, The Emotion Machine rewards careful reading. You'll learn a lot about how your mind works, even if you won't be all that much wiser about what is actually going on within your brain.
Richard Restak is a neurologist and the author most recently of "The Naked Brain: How the Emerging Neurosociety Is Changing How We Live, Work, and Love."