See No Bias

By Shankar Vedantam
Sunday, January 23, 2005

AT 4 O'CLOCK ON A RECENT WEDNESDAY AFTERNOON, a 34-year-old white woman sat down in her Washington office to take a psychological test. Her office decor attested to her passion for civil rights -- as a senior activist at a national gay rights organization, and as a lesbian herself, fighting bias and discrimination is what gets her out of bed every morning. A rainbow flag rested in a mug on her desk.

The woman brought up a test on her computer from a Harvard University Web site. It was really very simple: All it asked her to do was distinguish between a series of black and white faces. When she saw a black face, she was to hit a key on the left; when she saw a white face, she was to hit a key on the right. Next, she was asked to distinguish between a series of positive and negative words. Words such as "glorious" and "wonderful" required a left key; words such as "nasty" and "awful" required a right key. The test remained simple when two categories were combined: The activist hit the left key if she saw either a white face or a positive word, and hit the right key if she saw either a black face or a negative word.

Then the groupings were reversed. The woman's index fingers hovered over her keyboard. The test now required her to group black faces with positive words, and white faces with negative words. She leaned forward intently. She made no mistakes, but it took her longer to correctly sort the words and images.

Her result appeared on the screen, and the activist became very silent. The test found she had a bias for whites over blacks.

"It surprises me I have any preferences at all," she said. "By the work I do, by my education, my background. I'm progressive, and I think I have no bias. Being a minority myself, I don't feel I should or would have biases."

Although the activist had initially agreed to be identified, she and a male colleague who volunteered to take the tests requested anonymity after seeing their results. The man, who also is gay, did not show a race bias. But a second test found that both activists held biases against homosexuals -- they more quickly associated words such as "humiliate" and "painful" with gays and words such as "beautiful" and "glorious" with heterosexuals.

If anything, both activists reasoned, they ought to have shown a bias in favor of gay people. The man's social life, his professional circle and his work revolve around gay culture. His home, he said, is in Washington's "gayborhood."

"I'm surprised," the woman said. She bit her lip. "And disappointed."

MAHZARIN BANAJI WILL NEVER FORGET HER OWN RESULTS THE FIRST TIME SHE TOOK A BIAS TEST, now widely known as the Implicit Association Test. But whom could she blame? After all, she'd finally found what she was looking for.

Growing up in India, Banaji had studied psychophysics, the psychological representation of physical objects: A 20-watt bulb may be twice as bright as a 10-watt bulb, for example, but if the two bulbs are next to each other, a person may guess the difference is only 5 watts. Banaji enjoyed the precision of the field, but she realized that she found people and their behavior toward one another much more interesting. The problem was that there was no accurate way to gauge people's attitudes. You had to trust what they told you, and when it came to things such as prejudice -- say, against blacks or poor people -- people usually gave politically correct answers. It wasn't just that people lied to psychologists -- when it came to certain sensitive topics, they often lied to themselves. Banaji began to wonder: Was it possible to create something that could divine what people really felt -- even if they weren't aware of it themselves?

The results of one of Banaji's experiments as a young scholar at Yale University encouraged her. She and her colleagues replicated a well-known experiment devised by psychologist Larry Jacoby. Volunteers were first shown a list of unfamiliar names such as Sebastian Weisdorf. The volunteers later picked out that name when asked to identify famous people from a list of famous and unknown names. Because they had become familiar with the name, people mistakenly assumed Sebastian Weisdorf was a famous man. The experiment showed how subtle cues can cause errors without people's awareness.

Banaji and her colleagues came up with a twist. Instead of Sebastian Weisdorf, they asked, what if the name was Sally Weisdorf? It turned out that female names were less likely to elicit the false-fame error; volunteers did not say Sally Weisdorf was a famous woman. Women, it appeared, had to be more than familiar to be considered famous. Banaji had stumbled on an indirect measure of gender bias.


