
If It Feels Good to Be Good, It Might Be Only Natural


Such experiments have two important implications. One is that morality is not merely about the decisions people reach but also about the process by which they get there. Another implication, said Adrian Raine, a clinical neuroscientist at the University of Southern California, is that society may have to rethink how it judges immoral people.

Psychopaths often feel no empathy or remorse. Without that awareness, people relying exclusively on reasoning seem to find it harder to sort their way through moral thickets. Does that mean they should be held to different standards of accountability?

"Eventually, you are bound to get into areas that for thousands of years we have preferred to keep mystical," said Grafman, the chief cognitive neuroscientist at the National Institute of Neurological Disorders and Stroke. "Some of the questions that are important are not just of intellectual interest, but challenging and frightening to the ways we ground our lives. We need to step very carefully."

Joshua D. Greene, a Harvard neuroscientist and philosopher, said multiple experiments suggest that morality arises from basic brain activities. Morality, he said, is not a brain function elevated above our baser impulses. Greene said it is not "handed down" by philosophers and clergy, but "handed up," an outgrowth of the brain's basic propensities.

Moral decision-making often involves competing brain networks vying for supremacy, he said. Simple moral decisions -- is killing a child right or wrong? -- are simple because they activate a straightforward brain response. Difficult moral decisions, by contrast, activate multiple brain regions that conflict with one another, he said.

In one 2004 brain-imaging experiment, Greene asked volunteers to imagine that they were hiding in a cellar of a village as enemy soldiers came looking to kill all the inhabitants. If a baby was crying in the cellar, Greene asked, was it right to smother the child to keep the soldiers from discovering the cellar and killing everyone?

The reason people are slow to answer such an awful question, the study indicated, is that emotion-linked circuits automatically signaling that killing a baby is wrong clash with areas of the brain that involve cooler aspects of cognition. One brain region activated when people process such difficult choices is the inferior parietal lobe, which has been shown to be active in more impersonal decision-making. This part of the brain, in essence, was "arguing" with brain networks that reacted with visceral horror.

Such studies point to a pattern, Greene said, showing "competing forces that may have come online at different points in our evolutionary history. A basic emotional response is probably much older than the ability to evaluate costs and benefits."

While one implication of such findings is that people with certain kinds of brain damage may do bad things they cannot be held responsible for, the new research could also expand the boundaries of moral responsibility. Neuroscience research, Greene said, is finally explaining a problem that has long troubled philosophers and moral teachers: Why is it that people who are willing to help someone in front of them will ignore abstract pleas for help from those who are distant, such as a request for a charitable contribution that could save the life of a child overseas?

"We evolved in a world where people in trouble right in front of you existed, so our emotions were tuned to them, whereas we didn't face the other kind of situation," Greene said. "It is comforting to think your moral intuitions are reliable and you can trust them. But if my analysis is right, your intuitions are not trustworthy. Once you realize why you have the intuitions you have, it puts a burden on you" to think about morality differently.

Marc Hauser, another Harvard researcher, has used cleverly designed psychological experiments to study morality. He said his research has found that people all over the world process moral questions in the same way, suggesting that moral thinking is intrinsic to the human brain, rather than a product of culture. It may be useful to think about morality much like language, in that its basic features are hard-wired, Hauser said. Different cultures and religions build on that framework in much the way children in different cultures learn different languages using the same neural machinery.

Hauser said that if his theory is right, there should be aspects of morality that are automatic and unconscious -- just like language. People would reach moral conclusions in the same way they construct a sentence without having been trained in linguistics. Hauser said the idea could shed light on contradictions in common moral stances.

U.S. law, for example, distinguishes between a physician who removes a feeding tube from a terminally ill patient and a physician who administers a drug to kill the patient.

Hauser said the only difference is that the second scenario is more emotionally charged -- and therefore feels like a different moral problem, when it really is not: "In the end, the doctor's intent is to reduce suffering, and that is as true in active as in passive euthanasia, and either way the patient is dead."


© 2007 The Washington Post Company