When Seeing Is Disbelieving
Four years ago tomorrow, President Bush landed on the USS Abraham Lincoln and dramatically strode onto the deck in a flight suit, a crash helmet tucked under one arm. Even without the giant banner that hung from the ship's tower, the president's message about the progress of the war in Iraq was unmistakable: mission accomplished.
Bush is not the first president to have convinced himself that something he wanted to believe was, in fact, true. As Columbia University political scientist Robert Jervis once noted, Ronald Reagan convinced himself that he was not trading arms for hostages in Iran, Bill Clinton convinced himself that the donors he had invited to stay overnight at the White House were really his friends, and Richard M. Nixon sincerely believed that his version of Watergate events was accurate.
Harry S. Truman apparently convinced himself that the use of the atomic bomb against Japan in the fading days of World War II could spare women and children: "I have told Sec. of War to use [the atomic bomb] so that military objectives and soldiers and sailors are the target and not women and children," Truman noted in his diary.
Nor are U.S. presidents alone when it comes to deluding themselves: Successful politicians may just be more skilled at self-deception than the rest of us. Most people, perhaps all, seem hard-wired to interpret reality to suit their ends.
Self-deception has been uncovered in a wide range of situations, says Robert L. Trivers, an evolutionary biologist at Rutgers University who has studied the phenomenon. Before the Challenger explosion, for example, NASA engineers noticed that one of the O-rings on the space shuttle had been eaten a third of the way through. Since the shuttle had flown and returned to Earth, the engineers concluded that it was not a problem. Surveys show that four in five high school seniors believe they have exceptional leadership ability, and that nearly all professors believe they are above average.
During Colonial times, there were even people who managed to convince themselves that slavery was in the best interest of slaves; later on, some maintained that colonialism was in the best interest of poor countries.
War provides especially fertile soil for self-deception. Societies at war do not look kindly on derogatory assessments of their own fighting ability and motives, and they do not encourage talking up an enemy's strengths. This explains why both sides in many conflicts believe they are morally and militarily superior. (Each believes the other is deluding itself.)
Self-deception seems to be a universal trait, which presents an interesting problem for science, especially for scientists who study behavior from an evolutionary perspective. It makes sense for deception to abound in nature -- viruses find ways to sneak into our bodies, predators stealthily stalk prey, and countless species use camouflage to hide themselves from their attackers. But why would nature, after spending millennia evolving highly sophisticated senses to perceive the world, build in a psychological capacity that allows us to ignore what is right in front of our eyes?
Trivers says the primary use of self-deception appears to be that it aids people in deceiving others.
"Self-deception evolves in the service of deceit for two reasons," he said. "It improves your ability to fool others and, second, it reduces the cognitive costs of deception."
The thing to keep in mind, Trivers says, is that even as evolution rewarded deceivers, it also punished deceivers who got caught. (The ability to spot deception evolves along with the ability to deceive.)
Deliberate deception among humans, furthermore, requires effort. It requires you to hold both the truth and the untruth in your mind, and consciously suppress the truth. This is why the stereotype of liars depicts them with sweaty palms, croaking voices and shifty eyes -- lying can be hard work, and liars are often nervous about getting caught.
Self-deception, said Trivers, who has studied the phenomenon in contexts ranging from the Challenger explosion to a plane crash in Florida, offers a way around this psychological hurdle. If you can make yourself believe the untruth, for example, by marshaling evidence that supports your view and ignoring evidence that contradicts your position, it becomes that much easier to persuade others.
Like many other aspects of brain functioning, self-deception does not require people to sit down and decide they are going to lie to themselves. (That would actually defeat the point of self-deception.) No, it usually happens subtly, without the person even being aware of it.
"The costs of deception are being detected and punished," Trivers said. "There is definitely a downside to self-deception, and that is you are putting yourself out of touch with reality, but it cuts down the risk of getting caught."
Lyn Nofziger, a longtime adviser to Ronald Reagan, once said the same thing about his boss -- and about the utility of self-deception in politics: He could "convince himself that the truth is what he wants it to be. Most politicians are unable to do this, but they would give their eyeteeth if they could."