Believing Is Seeing
A nightmare for your consideration: epidemics of genital shrinkage. Known in Asia as "koro." Happens without warning, hundreds affected. A Singapore hospital reported 75 cases in a single frantic day in 1967. The victims, usually male, believe that their genitals are retracting into their bodies, and that complete retraction will be fatal. Thus they take countermeasures.
"Victims used everything from rubber bands to clothes pins in desperate efforts to prevent further perceived retraction."
That's from Hoaxes, Myths and Manias, by Robert Bartholomew and Benjamin Radford. The authors say this is a classic case of a mass delusion. From an initial rumor emerges a viral lunacy that sends otherwise sane individuals to the hospital clutching what is left of their once-beloved private apparatuses.
Personally, I would use duct tape and bungee cords.
Why do people believe things that aren't true? Mass delusions remind us of something counterintuitive: Bad information survives by building constituencies. You'd think that a crazy idea would have a tough time persuading lots of people that it's true, but crazy ideas find safety in numbers. It's hard for one person to believe something nutty, but easy for 20 people, and if several thousand people sign up for the idea, their group can probably get tax-exempt status.
A classic mass delusion took place in 1944 in Mattoon, Ill., when citizens became convinced that a "mad gasser" was going around spritzing people with a sickening gas. The panic lasted two weeks, even though it became clear that the difficulty of apprehending the mad gasser was directly related to the fact that he did not, in fact, exist.
Truth is innately elusive. Everyone in my generation grew up knowing that "Mama" Cass Elliot died from choking on a ham sandwich, which is one of those Untrue Facts, like cats sucking the breath from sleeping infants or sunbeams causing dust to rise from the floor. (An autopsy determined that Elliot died of heart disease.) Even certain incontrovertible scientific truths, such as the fact that Earth spins 365 and 1/4 times on its axis in a year, aren't actually true. It spins 366 and 1/4 times. You can look it up.
None of us is perfectly rational. When I start to get a cold, I take a trendy herbal medicine whose status as a miracle drug is based on a rumor passed around in the school parking lot. But I'm not spooked when I encounter the number 13 or a black cat crossing my path, or suddenly see pentagrams drawn in blood, after which the elevator doesn't stop at the basement but keeps going down and down and down and gets hotter and hotter amid an increasingly stygian odor and the deafening lamentations of the damned. That sort of stuff is just coincidence.
Part of the willingness to believe spurious "facts" comes from a distrust of science. In Aliens in America, for example, political scientist Jodi Dean writes: "To claim to have seen a UFO, to have been abducted by aliens, or even to believe those who say they have is a political act . . . It contests the status quo . . . Given the political and politicized position of science today, funded by corporations and by the military, itself discriminatory and elitist, this attitude toward scientific authority makes sense."
Except what they believe isn't true. That's not a political observation, unless insisting on objective reality can be considered political. And if it can, I'd like to sign up for the political party that's in favor of truth.
It's good to have beliefs, but it probably makes sense to have an exit strategy. You don't want your self-image depending on a belief that might not hold up. It's always a shame when you see someone shattered by the realization that their favorite Beatles song wasn't written by John but by Paul.
Years ago, I interviewed some interesting people who called themselves the Starseed and who believed they were alien entities from the Pleiades. They seemed completely normal, except for the alien thing. Most people with crazy beliefs are actually sane. They will even do research and seek verification or, more precisely, affirmation. The Starseed had a rule: "We will not invalidate a member's beliefs, opinions or experiences."
This is also how policy gets made in Washington.
Want to know how we get in protracted messes that could have been foreseen and prevented? By listening to the people who affirm our beliefs. By creating an ideological support group. By forming a mini-think tank called the People Who Agree With Me Institute.
Our leaders are not, as the Weekly World News continually reports, alien beings from outer space. But some do have a little bit of Starseed in them. They should be careful: They're candidates for serious shrinkage.
Read Joel Achenbach weekdays at washingtonpost.com/achenblog.