A new psychological test indicates that we all may be more prejudiced than we think.
Shankar Vedantam, whose article, See No Bias, about the test appeared in Sunday's Washington Post Magazine, was online Monday, Jan. 24, at 1 p.m. ET to field questions and comments.
Vedantam covers science and human behavior for The Post.
Editor's Note: Washingtonpost.com moderators retain editorial control over Live Online discussions and choose the most relevant questions for guests and hosts; guests and hosts can decline to answer questions.
Shankar Vedantam: Welcome everybody. Thanks for joining me to discuss my article in the Post's Sunday magazine, See No Bias.
We'll get to all the questions in a moment (keep them flying in!), but I wanted to toss out a couple of things to think about.
The part of the story that was hardest for me to absorb personally was the sense that people can be influenced by factors that they are not aware of. We all believe we are acting rationally, consciously, yet numerous studies show that we are routinely influenced by hidden factors. This is why people who explicitly hold no biases against anyone can implicitly have biases. Unlike previous models of prejudice, what this suggests is that active hostility is not needed for discrimination to occur.
One reason the IAT has generated so much interest is that it gets people to ask questions about themselves, to ask themselves how they reached certain opinions about others. Prompting that moment of reflection is really the point of the research, and the hope of my story.
Where are there copies of the test?
Shankar Vedantam: The Washington Post story can be found here.
The tests can be taken here.
Please note that as a result of heightened interest in the issue, the Harvard Web site crashed a couple of times over the weekend. The site was very slow when I tried to get access to it a moment ago. Be patient, or check back in after a day or two.
Finally, here's a link that answers many questions that inevitably arise. If I don't get to something in this chat, please check this out.
I found the description of the bias test very interesting. However, I wondered if the test itself might be slightly skewing results, by the order of the test sections. If the test associated the black faces with positive words first, and then the white faces with positive words, might the test taker show a different kind of bias? Has this been considered? I'm just thinking that people often commit things to memory by forming associations. If the first iteration of the test associates positive words with white faces, then during the second iteration, the test taker will very likely take longer because now he/she will be trying to throw out a newly formed association and replace it. I would suspect that someone whose memory doesn't work very well through association might score with a minimal bias, simply because of the order of the test sections. So while it seems like an interesting premise, the test (as described) seems like it might tell us more about how our memories work than it might tell us about racial bias. I haven't taken the test yet, though I probably will... but I'd really like to take it in reverse order and see what my results would be.
Shankar Vedantam: A good question and one that comes up regularly. There is a complicated and a simple answer. The simple answer is that the tests presented on the Harvard website have been randomized, so that users do not always get choices in the same order. Some volunteers get the positive associations first and the negative associations later and vice versa. The tests have also been randomized so that the left-right choices are different for different users. The data in the story are always drawn from randomized samples, so any ordering effects have been canceled out.
The complicated answer is that the order in which you take the tests can have a SMALL effect on results. Individuals might therefore want to take tests multiple times and average out their results.
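For readers curious about the mechanics, the counterbalancing logic described above can be sketched in a few lines of code. This is a toy simulation, not the real IAT scoring procedure: the latencies, the 30 ms order penalty, and the function names are all invented purely to illustrate how averaging over a counterbalanced sample cancels an ordering effect.

```python
import random
import statistics

def simulate_session(true_bias_ms, block_order, order_penalty_ms=30,
                     trials=20, seed=None):
    """Toy IAT-style score: mean latency on the 'incompatible' block
    minus mean latency on the 'compatible' block, in milliseconds.
    Whichever block comes second carries a small interference penalty,
    mimicking the order effect described in the answer above."""
    rng = random.Random(seed)
    mean_latency = {}
    for position, block in enumerate(block_order):
        base = 700 + (true_bias_ms if block == "incompatible" else 0)
        penalty = order_penalty_ms if position == 1 else 0
        mean_latency[block] = statistics.mean(
            rng.gauss(base + penalty, 50) for _ in range(trials))
    return mean_latency["incompatible"] - mean_latency["compatible"]

# Counterbalancing: half the simulated volunteers see the compatible
# block first, half see the incompatible block first.
scores = []
for i in range(1000):
    order = (["compatible", "incompatible"] if i % 2 == 0
             else ["incompatible", "compatible"])
    scores.append(simulate_session(true_bias_ms=100, block_order=order,
                                   seed=i))

# Averaged over the counterbalanced sample, the order penalty cancels
# and the mean score recovers the underlying 100 ms effect.
print(round(statistics.mean(scores)))
```

A volunteer who takes only one ordering would see a score shifted by the penalty in one direction or the other, which is exactly why individuals are advised to take the test several times and average their results.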
Your article was very interesting. However, it struck me as interesting that the test did not take three things into consideration:
1. If you give the participant the option to group the white faces with bad words first, the answers may turn out different: in a game based on muscle memory, the first way you play it is usually the easier way.
2. Also, grouping names that are "traditionally black" against ones that are white is a very misleading test. Not all black people have those names; in fact, they are particularly modern and regionally biased names. If a person grew up in Savannah, it would be much more likely that the black people they knew were named Mike or Vernon, not Jamal or Alonzo. One set of names would be unfamiliar to a vast majority of the country. It would be better to use Sven and Bjorn as the white names, because at least they would be as statistically familiar to different populations.
3. The reason the Sally name did not strike as many people as different as the Sebastian name did is simple. I do not know any famous people named Sebastian, so if I saw that name and it looked at all familiar, I would consider that it was a famous scientist. However, as I am more comfortable with the name Sally, I find it easier to cross it off any list of potentially famous individuals. Have they tried using the name Sabine instead to see how that one worked?
Shankar Vedantam: The question about whether the order of the tests has any effect on the results has been answered above.
The question about "traditional names" is very good: In the earliest version of the race bias test, Tony Greenwald associated positive and negative words with stereotypically black and white names like Adam, Chip, Jamel, Alonzo etc.
But what happens with names like Thomas, which are traditionally used by both blacks and whites? The researchers acknowledged this problem, which is why, once the technology permitted it, they stopped using black and white names and instead used black and white faces. The race bias test that is currently available to the public on the Harvard website uses only the faces. The results using the faces confirmed the finding of the tests that used the names.
Finally, as to the Sally-Sabine question, the names used in the story (Sebastian and Sally) were only meant as examples. The bias is evident when different male and female names are used.
Woodley Park, Washington, D.C.:
How do we know this test actually works? It's much more abstract than the experiments that sent out resumes with stereotypical
"white" or "black" names -- it's not clear to me that just because you take longer to click at a given prompt on this test, it means you have any bias. Even in the space of such a long article, this question was not sufficiently addressed: Why should we trust Banaji's conclusions? Critics are quickly dispensed with in the space of about a paragraph, and "Banaji believes the complaints are a sign of embarrassment." So she's made her results irrebuttable: Challenge them and you're just mad that you were outed as biased! Can we have more of a scientific response than "You're just mad"/"You wouldn't understand?"
Shankar Vedantam: This is the "so what?" question. Are these time differences on the tests measuring anything important? I think the simplest way to answer that question is to present a section of the story, which shows the different ways in which these apparently "trivial" time differences are associated with behavior with real-world consequences. These results have been collected by Andy Poehlman of Yale and are currently on their way to publication.
Please note that all the studies mentioned here were conducted under controlled laboratory conditions. One concern with doing real-world tests, as the story notes, is that such results could get used as weapons against individuals, which is something the researchers do not want to see happen.
"When volunteers who took the race bias test were given the option to work with a white or black partner, one study found, those with the strongest implicit bias scores on the test tended to choose a white partner. Another study found that volunteers with lower bias scores against gays were more willing to interact with a stranger holding a book with an obviously gay theme. A third experiment found that when volunteers were told that another person was gay, those whose scores indicated more bias against gays were more likely to avoid eye contact and show other signs of unfriendliness. A study in Germany by psychologist Arnd Florack found that volunteers whose results suggested more bias against Turks -- an immigrant group -- were more likely to find a Turkish suspect guilty when asked to make a judgment about criminality in an ambiguous situation.
In another study by psychologist Robert W. Livingston at the University of Wisconsin, Poehlman says, volunteers were given details of a crime in which a Milwaukee woman had been assaulted, suffered a concussion and required several stitches. In this case, Poehlman says, some volunteers were told the perpetrator had been proven to be David Edmonds from Canada. Others were told the guilty perpetrator was Juan Luis Martinez from Mexico. Volunteers were asked what length of sentence was appropriate for the crime: Bias scores against Hispanics on the implicit tests tended to predict a longer sentence for the Mexican.
An implicit attitude "doesn't control our behavior in a be-all and end-all kind of way, but it flavors our behavior in a pretty consistent way," says Poehlman.""
Hi there. Thanks for the thought-provoking article.
In the article, you mentioned the questionnaires where test-takers identify their demographic info and political leanings, and that the results of the questionnaires so far have found that conservatives "show higher levels of bias against gays, blacks and Arabs than liberals". This bias supposedly predicts policy preferences on race-related issues such as affirmative action and racial profiling.

My question is about the linkage between bias and policy preferences, which was ignored in the article (unless I missed something). Does "attribution" play a role in linking bias and thought-out policy positions? That is, do liberals, consciously or unconsciously, think: OK, I may have some bias against black/gay/Arab people, but that's a result of internal beliefs, things that I've learned or been taught, and my personal experiences, and is not a reflection of society? Conservatives, by contrast, would think: I'm biased, but my biases are justified based on the "real world" and how people really act (i.e., external attribution).

I would hate to think that policy differences stem simply from the level of bias between liberals and conservatives. What do you think, and how do you think IAT researchers would explain the link between bias and policy preferences?

Thanks again for the article!
Shankar Vedantam: First things first: Neither liberals nor conservatives say they are biased! Survey results show, in fact, dwindling numbers of people who admit to biases of any kind.
I think this gets to one of the more interesting facets of the research - it shows that implicit attitudes are quite different from explicit beliefs.
Finally, the story does not say that all policy positions are driven by bias. But it does suggest that biases may play a role in positions that people feel they have arrived at after careful consideration.
I'm sure there are dozens of comments like this in your queue, but I'll add one more vote:
Why should we assume that the association of negative words like "painful" with blacks indicates a negative attitude about blacks as people, as opposed to what we know about their circumstances? Compared to whites, blacks have less education and income and are more likely to be both crime victims and perpetrators. Most of us think of the life circumstances of blacks (considered as a group in their totality) as a problem to be solved, not the inevitable result of the inferiority of blacks themselves.
Shankar Vedantam: The issue of whether the biases unearthed by these tests are merely a sign of familiarity was a big area of criticism of the IAT in its early days. As the story notes, this issue has been addressed in two ways. First, it is possible to construct studies that eliminate familiarity as a factor. Second, and perhaps more important, there are numerous studies that show that IAT results are predictive of behavior. In other words, the fact that people more quickly associate words like humiliate and painful with gays and words like beautiful and glorious with heterosexuals was not very interesting until it was shown that these associations influence behavior toward gays.
It may also be important to note that biases are triggered by encounters with individuals - an individual black man, for instance - rather than an entire group of people. Obviously, individuals vary greatly in terms of socioeconomic status and education. We can all name black individuals who are wealthier and better educated than individual whites. Yet, our biases about groups get transferred to individuals. It is interesting to note that you can even show that asking volunteers to pay attention to different facets of an individual can produce different bias results. Asking volunteers to think about Michael Jordan as a brilliant athlete before taking race bias tests produces different results on average than asking them to think about Michael Jordan as a black man.
As before, people are not aware their attitudes have been subtly changed. I might mention in passing that there is a long history of priming studies that I could not get into in the article. Cueing particular ideas - something that marketers are very skilled at doing - can influence behavior without people's awareness of it.
Does your research demonstrate that minorities feel more positively toward whites compared to their own group? Some studies I conducted a few years ago using similar reaction time tests looking at within-group biases show positive bias toward subjects' own group, e.g. Hispanics are more positive toward Hispanics; Asians are more positive toward Asians compared to whites or other groups.
Cristina Ling Chard, Ph.D.
Evaluation Officer, World Bank Institute
Shankar Vedantam: The research does indeed show that many minorities internalize the same biases as majority groups. If you take a look at the graphic on Page 17 of the Sunday magazine, it shows that 38.4 percent of gay respondents show an anti-gay bias; 48.3 percent of blacks show an anti-black bias; and 36 percent of Arab Muslims show a bias against Arab Muslims. For majority groups, 82.5 percent of whites showed a pro-white or anti-black bias; 87.9 percent of straight respondents showed a bias against gays; and 68 percent of non-Arab non-Muslims showed a bias against Arab Muslims.
These findings, which are matched by results on other implicit tests, also address another issue. While minority groups may have biases against majority groups, it would be wrong to assume the biases of minority and majority groups against each other somehow "cancel" each other out. Rather, what the research suggests is that, on the race test for example, a pro-white bias is the norm across the culture - which is why an overwhelming majority of whites and a substantial number of blacks show the bias.
By the way, I was only the reporter on the story - this is not my own research!
This is terrifying, depressing stuff! On the one hand, I am thrilled that the Washington Post is shining some scholarly light on this aspect of our society. On the other hand, it kind of makes me want to crawl into a little ball and flog myself. How can we (the collective we) use this new information to be proactive instead of merely depressed?
Shankar Vedantam: Thank you for your kind words about the story. I sure hope the article doesn't make anyone flog themselves! Indeed, as the story notes, there is much cause for optimism because there is clear evidence that even as such biases are widespread, they can also be reversed by individuals, should people be so inclined.
In the interest of getting to as many questions as possible, I'm going to present an excerpt from the story here.
"There is growing evidence that implicit attitudes can be changed through exposure to counter-stereotypes. When the race test is administered by a black man, test takers' implicit bias toward blacks is reduced, says Irene Blair, a University of Colorado psychologist who recently conducted a review of studies that looked at how attitudes could be changed. Volunteers who mentally visualized a strong woman for a few seconds -- some thought of athletes, some thought of professionals, some thought of the strength it takes to be a homemaker -- had lower bias scores on gender tests. Having people think of black exemplars such as Bill Cosby or Michael Jordan lowered race bias scores. One experiment found that stereotypes about women became weaker after test takers watched a Chinese woman use chopsticks and became stronger after they watched the woman put on makeup. Interventions as brief as a few seconds had effects that lasted at least as long as 24 hours. But the volunteers were not aware of their attitudes having been changed.
Having counter-stereotypical experiences, in other words, might be much like going on a new diet with healthier food. Just as healthy eating can have a subtle impact on how people look and feel, counter-stereotypical experiences sustained throughout one's life seem to subtly change how one thinks."
As someone who recruits individuals to fill vacant positions, I have experienced many of the subtle ways that minority candidates get screened out. Managers/supervisors look at the applicant's name, where they went to school, and places where they have worked, and overlook very qualified candidates. I've even covered up names to see whether candidates were being screened out because the name was too ethnic. It's happened many times. Although we've changed the laws, the perceptions are harder to turn around.
Shankar Vedantam: Thanks for your comment.
I found this article to be absolutely fascinating. Is the antidote to our implicit prejudices simply to expose ourselves to experiences that run counter to our biases? It seems that may lead one to choose unwise behaviors.
Shankar Vedantam: Exposure to counter-stereotypes does seem to change these attitudes, according to the research. But other factors play a role as well. One study found, for example, that women who attended an all-women's college had lower bias scores on gender tests than women who went to co-ed colleges.
Of course there is a right and wrong interpretation of such findings: No one would suggest that all women should go to all women's colleges. But it does raise questions about why we see differences, and what can be done about them.
Finally, I should mention that there are many people who feel that addressing prejudice is not primarily about changing individuals but changing societal structures. As the story notes:
"Lani Guinier, President Bill Clinton's unsuccessful nominee for assistant attorney general for civil rights and now a professor at Harvard, is a fan of Banaji's work. But she says she worries the IAT will usher in superficial changes. The decor on the walls might be important, she says, but it isn't the real problem. "I worry people will think you can depress [implicit bias] scores through sporadic interventions," she says. "That will channel our efforts toward reform in potentially modest ways that don't fundamentally change the cultural swamp in which we are living.""
One of the questions I had while reading the magazine was whether humans can ever be free of bias, and if so, could the test reflect that? It doesn't seem possible, and maybe the best approach we can hope for as humans is to acknowledge our biases and try to counterbalance them, rather than blithely assume we are/can be free of bias. That would be especially helpful in setting public policy.
I was really blown away by this article, especially the reluctance of the lawmakers and advocates to identify themselves and the description of "illusory correlation." I always knew people were prone to judge other groups based on the actions of one or two members of that group, but never knew it had a name.
Thanks for illuminating the subject.
Shankar Vedantam: One of the questions I couldn't address in the story was whether implicit bias scores have declined over the last quarter century. Obviously, the IAT was not around 25 years ago, so there is no way to answer the question scientifically. But I think the consensus is that biases have declined - as American culture, laws and society have changed. Does this mean that there will be a day when these biases no longer exist? I don't know.
Certainly, the tests have prompted many people to question their own assumptions and beliefs. And as I said at the start, that moment of conscious, internal reflection is extremely powerful in fighting bias - certainly far more powerful than any outside pressure.
Takoma Park, Md.:
Regarding the "not biased" comment, it seems to me that all human behavior is by definition biased. Fairness and justice are human constructs -- a standard we would like to aspire to but almost never achieve.
All of us are biased in some ways, and do well to both recognize our biases and work to steer against them.
Shankar Vedantam: I think the research uses a definition of bias that most people can agree with. It merely asks, how well do your implicit attitudes line up with your explicit beliefs? Most people believe they can associate black faces with positive words as easily as they do white faces. The data reveal, however, that this is not the case. At a very simple level, that's a bias. What people do with the information is really up to them. Some people may want to get their implicit and explicit attitudes to align better. Others may be perfectly happy being the way they are.
I took the IAT on the Harvard site a couple of weeks ago, and found myself slightly favoring whites over blacks (I am black). It's disappointing, but not too surprising. Based on the research you report I am all for the "PC" approach -- more positive associations to jog our subconscious, such as a disproportionate number of minorities shown in TV programs and commercials in trusted rather than distrusted positions. Other tests I took: strongly for Kerry over Bush, no distinction between Jewish and Christian images, moderately preferring thin people to fat ones, and mildly favoring comedy over drama (only the last test surprised me!)
Shankar Vedantam: Thanks for your comment.
The perceptions one gets, which may influence the results, are driven by what is fed to us on a daily basis. Think of the last 10 black people you have seen on TV for more than one minute. Compare with the last 10 white people you have seen. What stereotypes do these portrayals feed?
Shankar Vedantam: I think one of the issues I couldn't really explore in my story was the role of the media in creating such attitudes. Jerry Kang of UCLA and Georgetown is especially interested in this issue - he believes the IAT research should be used to change how the FCC regulates the media. I should point out that many people believe that would be going too far; that the research is not at an advanced enough state to influence policy.
I know this is not directly related to your article, but part of it reminded me of the prejudices people discovered existed in elementary school texts back in the 1960s. In explaining what looked nice and what looked ugly, the text books would show drawings of sharp looking boys and girls versus ugly boys and girls. Of course, some of us can't help we were born ugly. I remember growing up playing alone at recess because I looked like one of those pictures. It is true: sometimes we are taught prejudice, and sometimes we didn't even realize it at the time.
Shankar Vedantam: Thanks for your comment.
I am AMAZED by the people who are searching for ANY way to make this test invalid.
Most likely the very people who hold bias are REFUSING to acknowledge the possibility.
Shankar Vedantam: I actually think it is good that people are questioning the research and holding it up to scrutiny. No respectable science has ever been hurt by tough scrutiny. I think Mahzarin Banaji, Tony Greenwald and Brian Nosek would be the first to welcome questions.
Other psychologists, for instance, are coming up with ways to improve the tests - the efficacy of these methods is still being debated, but it is entirely possible that improvements can be made. This is science, and science always admits the possibility of error, and the potential for improvement.
The study and your story were so interesting and exciting to read about! But I find myself wondering, as we throw around the word "bias", how it would be defined by the creators of this study? It seems to have negative implications across the board, but does bias necessarily mean someone is anti-gay/black/etc. or just has certain associations that may be stereotypical toward a particular group?
Thanks again for drawing some attention to this issue, and for an informative discussion!
Shankar Vedantam: I think the definition of bias in the tests was addressed earlier. But you raise an important issue - bias, as measured by the tests, does not necessarily have to imply a negative connotation. You can create implicit association tests measuring a bias for Pepsi over Coke, Bush over Kerry etc. Obviously, there is nothing right or wrong about those choices.
The research shows that in many areas, the IAT is really no better than explicit questions - people have no problem saying whether they are for Bush or Kerry, and those answers line up well with their actual preferences. It's only on sensitive matters that get at issues of social desirability - where people perceive there is a "right" answer and a "wrong" answer - that the IAT outperforms explicit questioning. Basically, asking people whether they have a preference for whites over blacks is a total waste of time!
I tried a few of the tests after reading the article, and results were predictable. The slight bias for white over black (ugh), a preference for free will over determinism, and, interestingly, a tendency to link women and science. However, it struck me as I took the tests that I was getting "better" at them. That there's a way to train your brain to ignore any subconscious correlations, which one gets better at with successive tests. Does that make sense to you? What might it imply for the results if it does?
Shankar Vedantam: For obvious reasons, most people who take the tests on the Harvard website are trying to "beat the test." The fact that the overall results show widespread bias suggests, I think, that people are not very good at beating the tests. I hear that Brian Nosek, who has taken hundreds of these tests while he helped to develop them, has gotten very good at beating the tests. But I am not sure his results are representative.
So if you take a test and find you don't have a bias, consider the possibility that you may not have a bias! It is definitely not all a matter of dexterity.
Fabulous article! I'm a social scientist and I really enjoyed the careful job you did in presenting this research.
Shankar Vedantam: Thank you.
I took the test online a couple of years ago, and it was interesting. Like the creators of the exam, I tested out as mildly prejudiced. One variation I'd love to see would be to test reactions to how people are dressed, e.g. the gang look, and do a threatening/nonthreatening test. E.g. put a picture of a black guy in a business suit and an Asian or white in gang clothes, amongst other pictures. I'd love to have a test like that, since, as a petite woman travelling on a bus alone at night in diverse San Francisco, I went more by attitude and apparel than skin color. But I wonder what my immediate innate reaction really is.
Anyway, great article!
Shankar Vedantam: Thank you. One of the best features of this research is that it gets lots of people to think of new tests. Do consider passing your ideas along to the folks at Harvard and the other schools! The next time we write an article about this, it could be about a test you've designed.
I read with avid interest -- as well as wry amusement -- the Magazine's article on implicit bias. As an African American, who has sensed some bias by lenders in my attempts to purchase a home a few years ago, I had no doubt that this type of bias existed, but, of course, it's difficult to prove in a concrete way.
Of particular interest to me were the results indicating the negative ways in which African Americans perceive themselves. I well remember reading about the famous doll study conducted by Dr. Kenneth Clark, wherein African American children were asked to choose between black dolls and white dolls. Without fail, each of the children chose the white dolls because those dolls were perceived as better (prettier, cleaner, etc.). Some of the children even cried after realizing the message they conveyed by their choices. Given the times in which these children lived (when Jim Crow laws and other white supremacist mores were reinforced by custom and law), I wondered whether, in today's climate, the negative images sometimes reflected by the Hip Hop culture and other cultural influences factor into African Americans' negative implicit biases about themselves.
I also wondered if classism colored (no pun intended) the implicit biases that may currently exist in the way some prominent African Americans may see members of the African American "underclass."
Shankar Vedantam: Thanks for your comment.
As for your last point, I think the researchers have indeed tried to tease apart the effects of social class from race bias. Different studies have shown that race and social class each trigger separate biases - it is entirely possible, I suppose, that such biases can reinforce one another.
What can I do tomorrow, after taking the IAT, to catch this unconscious process? Is awareness enough? Or is the time it takes to try not to rely on heuristics not "worth it"?
Shankar Vedantam: I certainly think the most valuable component of the research is its ability to get people to think about themselves and their opinions. Of course, it is unlikely we can always be conscious of our thought processes and behaviors - it would take us a half hour to brush our teeth in the morning and we would always be late! But it is perhaps not unreasonable to demand more deliberation when our decisions directly influence the lives of others.
I have a preference to hang around people who share my cultural tastes and values. If most of them are white, does that make me biased?
Shankar Vedantam: No.
I think there are no simple ways to determine how biases get formed. Having close friendships with people from different backgrounds does seem to predict lower biases. Does that mean that you have to distance yourself from people of similar backgrounds and tastes? Obviously not. This research speaks to greater inclusion, not exclusion.
Upper Marlboro, Md.:
Are the people responsible for this fascinating test serious when they hope that it won't be used to punish and/or hurt people? Of course it will! People will be forced to take the test on a smattering of areas in order to get a job; racial bias, homosexual bias, liberal or conservative bias, gender bias; the mind boggles at the possibilities.
Of course, there will be correct and incorrect biases inferred, depending on who is administering the test. One's religious training may censure certain activities and lifestyles. Adherence to that religion may require that one abstain from certain lifestyles or behaviors. However, one's religion may also require that the adherent reach out to those who participate in these sinful behaviors in order to save them from themselves. The bias against the behavior WILL, in fact, MUST show up on a test. However, the behavior of the individual may be loving, inclusive, and healing. Guess what: the bias will preclude the religious adherent from being accepted in many companies that might start using a variant of the bias test.
Scary, scary, scary... At the same time, I find it absolutely fascinating and do wish to see more information on coming research and analysis of their findings thus far.
Shankar Vedantam: You raise questions that have troubled a lot of people. As the story notes, there are also many others who feel the tests won't be used widely enough. Here's the section of the story that discussed this debate:
THE APPARENT ABILITY OF THE IMPLICIT ASSOCIATION TEST TO DETECT HIDDEN ATTITUDES AND PREDICT BEHAVIOR has raised questions about its potential uses. Might it predict, for example, which police officers are likely to mistakenly shoot an unarmed black man? Should such tests be used to cull juries of people with high bias scores? Might employers use such tests to weed out potential racists? Might employees trying to prove discrimination demand that their bosses take bias tests?
The problem, Banaji says, is that all those uses assume that someone who shows bias on the test will always act in a biased manner. Because this isn't true, Banaji and her colleagues argue against the use of the IAT as a selection tool or a means to prove discrimination. Banaji says she and her colleagues will testify in court against any attempt to use the test to identify biased individuals.
Another reason to limit the IAT's use: Research has shown that individuals who are highly motivated can successfully fool the tests by temporarily holding counter-stereotypes in their minds. (Other attempts to fool the tests -- such as consciously attempting to respond faster or slower -- tend to change results only slightly, if at all, Banaji says.) Banaji hesitates to perform real-world studies that examine, for instance, whether police officers with the most bias are the most likely to shoot an unarmed suspect in an ambiguous situation, because the results of such studies could be subpoenaed and used in lawsuits against police departments. The researchers say they want to keep the focus of the tests on public education and research. They are wary of having the tests used in lawsuits, because if people feared their results might one day be used against them, they would be hesitant to use the tests for personal education.
Banaji says she is keenly aware that psychology has a long history of tests -- starting with the "lie-detector" polygraph -- that have been hyped and misused. Personality tests that lack the rigor of the Implicit Association Test have been widely used by companies in employee training and even hiring. Atop Banaji's desk at work is a bust of a human skull marked with different brain areas once thought to be responsible for different emotions: a representation of the discredited science of phrenology. The bust is a daily warning about the many failed ways science has promised to unlock people's minds and personalities.
But even as Banaji hears from critics who say the Implicit Association Test, which is not patented, will get misused, some proponents tell her it would be unethical not to use the test to screen officials who make life-and-death decisions about others. One test in a British jail showed that, compared with other criminals, pedophiles had implicit associations linking children and sexual attraction. Should such tests be used to determine which pedophiles have been rehabilitated and should be eligible for parole or, more controversially, as a law enforcement tool to evaluate which individuals are at risk of harming children?
"People ask me, 'How do you sleep at night knowing this can be misused?'" Banaji says. "Others ask me, 'How do you sleep at night knowing this won't be used fully?'"
Thank you very much for writing this article. It pulled together a lot of great information that at some level everyone knows. I haven't taken the test, but I suspect the areas in which I will show negative bias. Are you planning on doing any follow-up articles? I just read the article two minutes ago and saw you were available at 1 p.m. today. Consequently, I am ill prepared for discussion. However, it was a WONDERFUL article. Thank you very much for giving it to us.
Shankar Vedantam: Thanks very much.
I ran across this test a while ago on the Web, and took it for race and gender bias. My impression was that the results were very accurate.
Thanks for your article -- it went a long way toward explaining the basis for the test and why one should not feel guilty for one's cultural programming. In today's world it is imperative that we maximize our human ability to modify that programming with rational behavior.
I have recommended the test to others in their search for self-knowledge. Now I have a really good reference to explain why!
Shankar Vedantam: Thanks.
No question, but a comment. I am a psychology professor at Northern Virginia Community College. I use the IAT in my Social Psychology class. The students do not have to tell me their scores -- they have to write a paper on the process of assessing implicit attitudes. However, they always tell me their scores in great protest that this system is rigged. It makes for a great discussion on internal prejudice vs. outward discrimination. Thanks for the article!
Shankar Vedantam: Thanks.
How would you teach people to become aware of these biases in their everyday interactions? How can they be sure that these biases don't impact managerial decisions?
Shankar Vedantam: I think the research speaks to the importance of developing criteria that are as objective as possible. When musicians at orchestra auditions played without judges being able to see them, the number of women hired by orchestras dramatically increased. As I've said before, there is something about findings like this that seems, at one level, hard to believe. We all believe so deeply that we can evaluate others according to the content of their character, rather than the color of their skin, and are shocked to find that might not always be the case.
Max Bazerman at Harvard Business School suggests reducing the importance of interviews in the hiring process, since he says they are largely predictors of whether a manager likes a candidate, not whether the candidate is the best person for the job. Beyond the issue of discrimination, this gets at a notion that some people are exploring: by not paying attention to managerial biases, and allowing talented candidates to slip by, are companies paying a "stereotype tax"? I toss this out only as a provocative idea, not one that has as yet been fully vetted.
Very thoughtful article... thank you.
I would be interested to know if the bias test has been given to children, particularly early elementary school age, perhaps even preschoolers, to determine when bias begins to develop.
Shankar Vedantam: The question about children is excellent. I wish I had had the space to get into this in the story. There is a lot of work being done that shows biases are evident in young children. What is not clear, however, is what should be done about it. Again, the research is in too early a stage to suggest policy implications, but it is certainly intriguing.
Have the authors of the study thought about adapting the test so that people who are blind or who have impaired motor function can take it?
Shankar Vedantam: An excellent question. I don't know the answer. The hurdle is clearly technological. The IAT ought to be able to measure associations of any kind: visual, audio, written text, etc.
Thanks for your article.
Considering the difference in the assessment of society between Blacks and Whites, I've often wondered whether I just don't understand (being White and male).
On the other hand, I do know that Black friends of mine have attributed some situations to racism that just plain weren't racism, but were simply part of being human (e.g., a person who is rude to a Black person may just be equally rude to everybody).
This is an issue that Whites (or males) tend NOT to address, whereas Blacks (or females) tend to over-address.
Shankar Vedantam: Thanks for your comment.
I read the article and was very fascinated by the topic. Where can I go to find additional information on this?
Shankar Vedantam: As I said before, here's a link that answers many frequently asked questions. https://implicit.harvard.edu/implicit/demo/faqs.html
Shankar Vedantam: That's all we have time for today. Thank you for all the insightful questions. It's a provocative topic, and I'm certain we have not heard the last word on the subject.
Before closing, I'd like to admit to a strong bias of my own. I hope the Philadelphia Eagles win the Super Bowl!