washingtonpost.com > Washington Post Magazine

See No Bias

There is likely a biological reason people so quickly make assumptions -- good or bad -- about others, Banaji says. The implicit system is likely a part of the "primitive" brain, designed to be reactive rather than reasoned. It specializes in quick generalizations, not subtle distinctions. Such mental shortcuts probably helped our ancestors survive. It was more important when they encountered a snake in the jungle to leap back swiftly than to deduce whether the snake belonged to a venomous species. The same mental shortcuts in the urban jungles of the 21st century are what cause people to form unwelcome stereotypes about other people, Banaji says. People revert to the shortcuts simply because they require less effort. But powerful as such assumptions are, they are far from permanent, she says. The latest research, in fact, suggests these attitudes are highly malleable.

Such reassurance has not assuaged test takers, who are frequently shocked by their results. The tests are stupid, and the results are wrong, some say. People have argued that the tests are measures of only hand-eye coordination or manual dexterity. Some have complained about which groups are assigned to the left- and right-hand keys, and about how the computer switches those categories. None of these factors has any real impact on the results, but Banaji believes the complaints are a sign of embarrassment. Americans find evidence of implicit bias particularly galling, Banaji theorizes, because more than any other nation, America is obsessed with the ideal of fairness. Most of the people approached for this article declined to participate. Several prominent politicians, Republican and Democrat, declined to take the tests for this article. The aide to one senator bristled, "You think he is a racist!"

Harvard's Mahzarin Banaji is one of three researchers who developed the Implicit Association Test. (Stella Johnson)


But the tests do not measure actions. The race test, for example, does not measure racism as much as a race bias. Banaji is the first to say people ought to be judged by how they behave, not how they think. She tells incredulous volunteers who show biases that it does not mean they will always act in biased ways -- people can consciously override their biases. But she also acknowledges a sad finding of the research: Although people may wish to act in egalitarian ways, implicit biases are a powerful predictor of how they actually behave.

PEOPLE WHO FIND THEIR WAY TO THE HARVARD WEB SITE THAT HOSTS THE IMPLICIT ASSOCIATION TEST are asked a few questions about themselves. The tests are anonymous, but volunteers are asked about their sex, race and whether they consider themselves liberal or conservative.

The voluntary questionnaires have allowed Banaji and her colleagues to arrive at one of the most provocative conclusions of the research: Conservatives, on average, show higher levels of bias against gays, blacks and Arabs than liberals, says Brian Nosek, a psychologist at the University of Virginia and a principal IAT researcher with Greenwald and Banaji. In turn, bias against blacks and Arabs predicts policy preferences on affirmative action and racial profiling. This suggests that implicit attitudes affect more than snap judgments -- they play a role in positions arrived at after careful consideration.

Brian Jones, a Republican National Committee spokesman, says the findings are interesting in an academic context but questions whether they have much relevance in the real world. "It's interesting to ponder how people implicitly make decisions, but ultimately we live in a world where explicit thoughts and actions are the bottom line," he says. Volunteers drawn to the tests were not a random sample of Americans, Jones adds, cautioning against reading too much into the conclusions.

Though it's true that about two-thirds of test takers lean liberal, Banaji says, the sample sizes are so large that the skewed sample does not undermine the conclusions. And Andy Poehlman, a graduate student at Yale, has tracked 61 academic studies using the IAT to explore how implicit attitudes predict people's actions.

When volunteers who took the race bias test were given the option to work with a white or black partner, one study found, those with the strongest implicit bias scores on the test tended to choose a white partner. Another study found that volunteers with lower bias scores against gays were more willing to interact with a stranger holding a book with an obviously gay theme. A third experiment found that when volunteers were told that another person was gay, those whose scores indicated more bias against gays were more likely to avoid eye contact and show other signs of unfriendliness. A study in Germany by psychologist Arnd Florack found that volunteers whose results suggested more bias against Turks -- an immigrant group -- were more likely to find a Turkish suspect guilty when asked to make a judgment about criminality in an ambiguous situation.

In another study by psychologist Robert W. Livingston at the University of Wisconsin, Poehlman says, volunteers were given details of a crime in which a Milwaukee woman had been assaulted, suffered a concussion and required several stitches. In this case, Poehlman says, some volunteers were told the perpetrator had been proven to be David Edmonds from Canada. Others were told the guilty perpetrator was Juan Luis Martinez from Mexico. Volunteers were asked what length of sentence was appropriate for the crime: Bias scores against Hispanics on the implicit tests tended to predict a longer sentence for the Mexican.

An implicit attitude "doesn't control our behavior in a be-all and end-all kind of way, but it flavors our behavior in a pretty consistent way," says Poehlman.

In perhaps the most dramatic real-world correlate of the bias tests, economists at the Massachusetts Institute of Technology and the University of Chicago recently sent out 5,000 résumés to 1,250 employers who had help-wanted ads in Chicago and Boston. The résumés were culled from the Internet and mailed out with one crucial change: Some applicants were given stereotypically white-sounding names such as Greg; others were given black-sounding names such as Tyrone.

Interviews beforehand with human resources managers at many companies in Boston and Chicago had led the economists to believe that black applicants would be more likely to get interview calls: Employers said they were hungry for qualified minorities and were aggressively seeking diversity. Every employer got four résumés: an average white applicant, an average black applicant, a highly skilled white applicant and a highly skilled black applicant.

The economists measured only one outcome: Which résumés triggered callbacks?

To the economists' surprise, the résumés with white-sounding names triggered 50 percent more callbacks than résumés with black-sounding names. Furthermore, the researchers found that the high-quality black résumés drew no more calls than the average black résumés. Highly skilled candidates with white names got more calls than average white candidates, but lower-skilled candidates with white names got many more callbacks than even highly skilled black applicants.

"Fifty percent? That's huge," says Sendhil Mullainathan, an economist who led the study and who recently moved to Harvard to work with Banaji. Human resources managers were stunned by the results, he says. Implicit bias, says Mullainathan, can occur not only without the intent to discriminate, but despite explicit desires to recruit minorities. Implicit attitudes need only sway a few decisions to have a large impact, he says. For example, if implicit bias caused a recruiter to set one résumé aside, it could be just one of 100 decisions the recruiter made that day. Collectively, however, such decisions can have dramatically large consequences.

SAJ-NICOLE JONI WAS THE FIRST WOMAN TO BE HIRED AS AN APPLIED MATHEMATICS PROFESSOR AT MIT. It was 1977, and there were no women's bathrooms in her building. Joni was not particularly surprised. She had battled obstacles all her life. When she first declared -- at age 12 -- that she was going to be a mathematician, her announcement evoked gales of laughter at a family gathering. But opposition only made her more determined. After a successful stint at MIT, Joni worked for Microsoft and then launched a successful business consulting firm called the Cambridge International Group Ltd. Her recent book, The Third Opinion, stresses the importance of seeking diverse points of view.

Joni was recently introduced to Banaji and expressed interest in taking the Implicit Association Test. Like most volunteers, she did not think she had biases and believed strongly in "meeting people as they are, without looking at the color of their skin."

Given Joni's background, Banaji thought it would be interesting for her to take a bias test that examined whether Joni associated men or women with careers in science. Most people find it easier to associate men with the sciences -- but Joni was clearly not most people.

The test came up on the screen. Joni's fingers, trained for many years on the piano, flew as she sorted words such as "husband," "father," "mother" and "wife" into "male" and "female" groups. She then grouped words such as "chemistry," "history," "astronomy" and "music" under "science" or "liberal arts." The computer then asked her to group "male" with "science" and "female" with "liberal arts."

When the groupings were reversed, Joni had to group "male" words with "liberal arts," and "female" words with various disciplines in science. She made a mistake in classifying "uncle." She hesitated over "astronomy" and made a second mistake in classifying "physics."

The results popped up: "Your data show a strong association between science and Male relative to Female."

Joni's fingers tapped the table in frustration. "I fought for women to be scientists all my life," she said, incredulous. Banaji nodded sympathetically. Her own results on this test were similar.

While Banaji says such results show the pervasive power that cultural biases have even on those who are themselves the victims of such biases, critics of the Implicit Association Test have asked whether it might be merely measuring people's awareness of bias. In other words, might Joni and Banaji associate men with careers in science precisely because, as women who chose to be scientists, they were intimately familiar with the obstacles? Alternatively, could the tests be picking up something about the larger culture, rather than about the individual herself?

Banaji says that researchers have shown the implicit tests are measuring more than mere awareness of bias, through studies that cancel out the effects of familiarity.

"Is the IAT picking up something about the culture?" Banaji asks. "Yes, but it is picking up that aspect of the culture that has gotten into your brain and mind."

On the race test, for example, a sophisticated brain-imaging study showed that implicit bias tests can predict fear responses among volunteers. Banaji and New York University neuroscientist Elizabeth Phelps had white volunteers take the implicit race bias test and then undergo brain scans called fMRIs, which measure instantaneous changes in brain activity. Those with the most bias on the implicit tests showed the most activity in the brain area called the amygdala when photos of black faces, obtained from college yearbooks, were flashed before their eyes. The amygdala is part of the primitive brain involved with fear responses.


© 2005 The Washington Post Company