Bias

Implicit Bias: The Hidden Biases of Good People (Summer 2017) Class ID: 1929

Study leader:  Lloyd Stires (lstires@auxmail.iup.edu)

Osher Ambassador:  Karen Nickell (ktnickell@comcast.net)

Articles and videos available on the internet:

Badger, E.  (2016).  We’re all a little bit biased, even if we don’t know it. https://www.nytimes.com/2016/10/07/upshot/were-all-a-little-biased-even-if-we-dont-know-it.html?mcubz=1

Fiske, S. T.  (2010).  We might be more racist than we think we are. http://www.alternet.org/story/148871/we_might_be_more_racist_than_we_think_we_are

Kang, J.  (2016).  Implicit bias.  This is an excellent series of seven videos–a preface and six lessons.  You can start with the preface (below) and follow the links to the other six videos: https://www.youtube.com/watch?v=R0hWjjfDyCo

Mullainathan, S.  (2015).  Racial bias, even when we have good intentions. https://www.nytimes.com/2015/01/04/upshot/the-measuring-sticks-of-racial-bias-.html

Scientific American Frontiers: The Hidden Prejudice. https://www.youtube.com/watch?v=3Nj-MjBc-xQ

Tsipursky, G.  (2016).  Fact-checking Clinton and Trump is not enough. http://theconversation.com/fact-checking-clinton-and-trump-is-not-enough-67506

Wise, T.  (1999).  Famous last words: Exploring the depths of racist conditioning. http://www.timwise.org/1999/04/famous-last-words-exploring-the-depths-of-racist-conditioning/

Postings from my blog, Thinking Slowly:

On the Implicit Association Test. http://thinkingslowlyblog.blogspot.com/2014/02/the-implicit-association-test-racial.html

On Old-Fashioned Racism. http://thinkingslowlyblog.blogspot.com/2013/01/old-fashioned-racism.html

On Modern Racism. http://thinkingslowlyblog.blogspot.com/2013/11/symbolic-racism-and-gun-ownership.html

On race and voter suppression. http://thinkingslowlyblog.blogspot.com/2014/10/voter-id-and-race-part-1.html

On teaching bias.  http://l-stires.com/thinking-slowly/teaching-bias-part-1/ and http://l-stires.com/thinking-slowly/teaching-bias-part-2/

On race and physical size:  http://l-stires.com/thinking-slowly/why-bad-dudes-look-so-bad/

On race and welfare:  http://l-stires.com/thinking-slowly/what-does-a-welfare-recipient-look-like/

On “dog whistle politics:”  http://l-stires.com/thinking-slowly/a-darker-side-of-politics/

Suggestions for further reading:

Banaji, M. R., & Greenwald, A. G.  (2013).  Blind Spot: Hidden Biases of Good People.  New York:  Delacorte.  The book from which I cribbed my subtitle, this is the most accessible book about implicit bias.

Gladwell, M.  (2005).  Blink: The Power of Thinking Without Thinking.  New York: Little, Brown and Co.  Recommended with reservations, since Gladwell uses anecdotes to exaggerate the usefulness of fast thinking.

Haney-Lopez, I.  (2014).  Dog Whistle Politics.  New York:  Oxford University Press. How implicit biases are exploited in political advertising.

Kahneman, D.  (2011).  Thinking, Fast and Slow.  New York:  Farrar, Straus and Giroux.  This best-seller is an excellent summary of the implications of dual-process theories of thought.

Tesler, M.  (2016).  Post-Racial or Most-Racial?  Chicago:  University of Chicago Press.  Original research on the effects of racial attitudes on political decisions. More difficult reading than the others.

Session 1 (July 6)

Dual Process Models of Thought

Twenty-five years ago, most psychologists believed that human behavior was guided mostly by conscious thoughts and feelings. Now, the majority of cognitive and social psychologists agree that much of human judgment and behavior occurs without conscious thought.

Research on implicit bias only makes sense within the context of dual process theory, and implicit bias research constitutes some of the strongest evidence for dual process theory.

Following Daniel Kahneman (2011), we’ll refer to the two cognitive systems as System 1 (fast) and System 2 (slow). They could also be called automatic and controlled processes, or the impulsive and reflective systems.

System 1 processes accomplish many useful things: proprioception (our sense of location and movement), depth perception, face recognition, language processing, etc.

The defining feature of System 1 processes is their autonomy.

  • Their execution is rapid.
  • Their execution is automatic when the triggering stimuli are encountered.
  • They are spontaneous; they do not depend on input from System 2.
  • They do not require conscious attention or cognitive effort.
  • They can operate in parallel without interfering with one another or with System 2 processing.

System 2 contrasts with System 1:

  • It is slow.
  • It is the focus of our awareness.
  • It requires attention and effort; usually only one System 2 process can be executed at a time.
  • System 2 processing is language- and rule-based, e.g., computing 17 x 24.
  • When we think of ourselves (our consciousness, identity), we are thinking of System 2.

The most critical function of System 2 is to override System 1 when necessary.

When solving problems or making decisions, System 1 is quick but not always accurate. System 2 is able to do detailed analyses in situations of importance by engaging in hypothetical reasoning—imagining alternative responses and choosing the best one.

Cognitive miser hypothesis.

System 2 has an important flaw that makes it less than rational: We are cognitive misers.  System 2 gives us greater accuracy, but at the cost of requiring attention and concentration, which is often experienced as aversive (unpleasant).

The human default is the shallow processing of the cognitive miser.

Things that System 1 does.

  1. System 1 automatically forms associations between concepts.

  2. It is influenced by the salience of information. Salient events are emotionally interesting, concrete (rather than abstract), and happened nearby or recently. Priming is anything that temporarily increases the salience of a stimulus.

  3. It is vulnerable to framing effects, e.g., “90% survival rate” vs. “10% mortality.”

  4. System 1 is biased to believe and confirm whatever information it is given. In order for System 1 to understand a statement, it must believe it; System 2 processes are required to “un-believe” it.

The History of the Measurement of Prejudice

Attitude = a favorable or unfavorable evaluative response to a person, object or event.

Three components (ABCs) of an attitude:

  1. Affect (emotion, feeling)
  2. Behavior (or behavioral intention)
  3. Cognition (beliefs)

Prejudice = a negative affective response toward some socially defined group of people, or toward any person merely because he or she is perceived as a member of that group.

Discrimination = an overt behavior which is harmful to the interests of a socially defined group, or of any person because he or she is perceived as a member of that group.

Stereotype = a generalization about a group of people in which certain traits are assigned to virtually all members of the group, regardless of actual variation among the members.

Racism (and sexism) reflect both individual behavior and institutional practices.

Individual racism = prejudice, discrimination and stereotyping by specific persons.

Institutional racism = cultural practices that are harmful to the interests of a group, but that operate independently of individual behavior.

Old Fashioned Racism (OFR)

The first attempts to measure prejudice (in the 1920s and 1930s) were surveys that asked direct questions and tabulated the answers.

These surveys found strong majority support for the ideology of white supremacy:

  1. support for social distance between blacks and whites.
  2. belief in the biological inferiority of blacks.
  3. support for social policies of segregation and institutionalized discrimination.

These beliefs are sometimes called old-fashioned racism.

For a variety of reasons, OFR declined sharply during the last half of the 20th century. There is still a hard core of old-fashioned racists, somewhere between 5% and 20% depending on the question.

Measures of OFR are transparent and may underestimate prejudice.

The surveys may simply show that it has become less socially desirable to express racial prejudice to an interviewer.

Jones & Sigall (1971)–the bogus pipeline

Those in the bogus pipeline group are attached to an apparatus which they are falsely told measures their responses to a test of stereotyping. They are asked to guess what the machine says.  The control group fills out the scale under standard conditions.

The bogus pipeline group reports significantly more negative ratings of “Negroes.”

Implication: Bogus pipeline condition is the more honest response. Those who take the survey under standard conditions are not responding candidly.

Needed: Less transparent measures of prejudice.

Session 2 (July 13)

Measures of racial attitudes

Transparent (continued)

  • Anti-black affect (the “feeling thermometer”)
  • Stereotypes

Less transparent

Modern Racism (also called racial resentment)

Principle-implementation gap: White Americans increasingly reject racial injustice in principle, but consistently oppose any measures intended to eliminate the injustice.

From this we might infer that the way many Americans express their racism is by opposing policies intended to help Black people achieve equality.

Modern racism, aka racial resentment = prejudice revealed in subtle, indirect ways, such as opposition to social policies intended to reduce racial inequality (e.g., affirmative action and open-housing laws).  Among the themes associated with modern racism are:

  1. African-Americans no longer face much discrimination.
  2. Their disadvantages are a result of a poor work ethic.
  3. They are demanding too much help, or too fast.
  4. They have gotten more than they deserve.

It may be argued that modern racism is really just principled conservatism—traditional values that are independent of race, such as the importance of hard work and the belief that no group should get favored treatment.

The principled conservatism argument is undermined by the fact that Americans are more opposed to affirmative action for Blacks than for women.

Summary:  Some of the people who oppose policies to help Blacks do so because they are principled conservatives; some of them oppose policies to help Blacks because they are racially biased, and some are both conservative and racially biased.  Conservatives must explain why conservatism and prejudice are so strongly associated with one another.

The Modern Racism Scale has turned out to be very useful.  Modern racism is a much stronger predictor of discrimination against Black people than Old Fashioned Racism. It is extensively used in political polling, where it is usually called racial resentment.

Unobtrusive measures

Unobtrusive measures study discriminatory behavior in a context in which participants are not aware that discrimination is being measured. This typically involves deception.

Example:  Helping behavior–Both Blacks and Whites show same-race favoritism. Whites discriminate against Blacks more than Blacks do against Whites.

Audit studies–Pairs of White and Black testers with identical resumes, selected and trained to appear and behave similarly on all characteristics other than race, are sent out to apply for jobs, rent apartments, etc. White applicants are called back twice as often as Black applicants.

Measures of implicit bias

The Implicit Association Test (IAT)

The Race IAT asks the question: Will you be faster at associating Black faces with pleasant words, faster at associating White faces with pleasant words, or equally fast at both?

68% of White people show an automatic preference for White while only 14% automatically prefer Black and 18% have no preference. Among Black respondents, about 40% prefer White, 40% prefer Black, and 20% have no preference. The preferences of Asian and Hispanic respondents are similar to those of Whites.

The IAT illustrates the principle of unconscious association. It is an automatic response of System 1. It is difficult for System 2 to override System 1 within the time constraints imposed by the task.
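To make the mechanics concrete, here is a minimal sketch (in Python, with invented response times) of the comparison the test rests on: average how long it takes to respond when White faces share a key with pleasant words versus when Black faces do. Project Implicit's actual scoring (the "D score") adds refinements this sketch omits, so treat it only as an illustration of the principle.

```python
# Minimal illustration of the IAT's logic: compare average response
# latencies (in milliseconds) across the two critical pairing blocks.
# The latencies below are invented for illustration; the real scoring
# used by Project Implicit (the "D score") also divides by a pooled
# standard deviation and handles errors and outliers, omitted here.

def mean(values):
    return sum(values) / len(values)

# Block A: White faces + pleasant words share one key.
white_pleasant_block = [650, 700, 620, 680, 710]   # hypothetical latencies

# Block B: Black faces + pleasant words share one key.
black_pleasant_block = [820, 790, 860, 800, 840]   # hypothetical latencies

difference = mean(black_pleasant_block) - mean(white_pleasant_block)

if difference > 0:
    print(f"Faster when White is paired with pleasant ({difference:.0f} ms gap): "
          "an automatic preference for White.")
elif difference < 0:
    print(f"Faster when Black is paired with pleasant ({-difference:.0f} ms gap): "
          "an automatic preference for Black.")
else:
    print("Equally fast in both pairings: no automatic preference.")
```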

The Project Implicit website contains IATs based on nationality, religion, sexual preference, age, weight, disability status, race and gender stereotypes, preferences among political candidates, etc.

On the Race IAT, men show a stronger automatic preference for White than women, older and younger people are more pro-White than middle-aged people, and conservatives are more pro-White than liberals.

In general, Whites, men, older people and conservatives have stronger preferences for socially dominant groups.

Session 3 (July 20)

Critique of implicit bias #1:  Unreliability of the IAT

The reliability of the Implicit Association Test has been questioned.  A test is reliable if it gives you the same result every time you take it.  The IAT exhibits a practice effect in which scores move toward “no preference” with repeated testing.

For this reason, the IAT should not be used for diagnostic purposes, e.g., to decide whether to hire a job applicant.  The IAT tells us more about society than about any specific individual.

“If my IAT showed an automatic preference for White, does this mean I am prejudiced?”

The answer is a qualified “no.”  The IAT measures implicit bias, a System 1 response. Prejudice is a conscious System 2 response in which a person expresses an attitude of dislike of another person or group. With conscious effort, System 2 may be able to override our automatic behavioral tendencies.

However, implicit bias is positively related to other measures of prejudice; the correlation is small (r = +.12).

Implicit bias is also predictive of discrimination against Black people. The correlation between IAT scores and discrimination is of moderate size (r = +.24).

Example: White participants were interviewed twice, by a Black woman and a White woman. People with IAT scores that indicated a preference for White:

  • showed less friendly nonverbal behavior toward the Black interviewer (compared to the White interviewer).
  • sat further away from the Black interviewer.
  • were rated by the Black interviewer as less friendly.

Studies such as this help to establish the validity of the IAT. A valid test is one that measures what it claims to measure (rather than something else). Implicit bias should result in a tendency to discriminate against Black people, and it does. Implicit bias is a better predictor of discrimination than more transparent measures of prejudice.

What are the causes of implicit bias?

Personal experiences during childhood

Mahzarin Banaji gave the IAT to 6-year-olds, 10-year-olds, and adults. She found that the 6- and 10-year-olds showed own-race preference comparable to that of adults.

Familiarity

Previous research suggests that repeated exposure to a stimulus (a word, a photo) causes us to like it better. If young children spend more time interacting with people of their own race than other races, and if these interactions are usually pleasant, they should develop an implicit preference for people of their own race.

Preference in infants is measured by time spent looking at a stimulus. Consistent with the familiarity hypothesis, newborns show no preference, but 3-month-old White infants show a preference for photos of people of their own race.

Critique of implicit bias #2:  Familiarity

One challenge to the validity of the IAT is the claim that implicit bias is nothing but familiarity. I believe that, while familiarity contributes to implicit bias, it’s not the whole story. If it were, most Blacks would show an automatic preference for Black, and they don’t.

Teaching implicit bias

Alison Skinner reported two studies with pre-school children. In the first study, children were shown a video of an adult female actor interacting with two adult female targets, who wore either a red or a black shirt. The actor showed a non-verbal preference for one target over the other—smiling and leaning in toward one, scowling and leaning away from the other. Later:

  • The children said they liked the preferred target better.
  • They were more willing to give her a gift.
  • They were more likely to imitate her behavior. 

The children “caught” the adult actor’s preference.

In a second study, children were shown the same video, but were introduced to two new women, said to be members of the red group or the black group and friends of the previous targets. The actor’s preference for one of the targets generalized to other members of her group.

Session 4 (July 27)

A similar study was done involving television. The researchers recorded 90 10-sec segments from 11 popular TV shows in which a White character interacted with a White or Black target. Judges rated how positively the unseen target was being treated. The results showed evidence of nonverbal racial bias on TV.

  • When the 11 programs were scored for the amount of racial bias, it was found that viewers who regularly watch the more biased programs showed a greater preference for White on the IAT.
  • Pro-Black and pro-White tapes were created from these segments. Participants who were shown the pro-White tape showed a greater preference for White on the IAT than those who were shown the pro-Black tape.
  • Those who were shown the pro-White tape also showed more implicit bias when it was measured using subliminal priming.

Media exposure

We now spend almost 8 hours per day on media exposure, with television accounting for approximately half of that time.

George Gerbner’s cultivation theory: TV is a centralized system of story-telling that shapes or cultivates the way we think and relate to one another. When the TV view of an issue conflicts with social reality, heavy TV viewers show greater acceptance of the TV view than moderate or light viewers.

Example: Mean world syndrome—Heavy viewers

  • overestimate the frequency of crime.
  • are more personally fearful of crime.
  • support greater use of force by the police.

Heavy TV watching promotes primarily conservative attitudes.

Crime

Researchers have compared the % of Black, White and Hispanic criminals and victims on local TV news to actual crime statistics. Black and Hispanic criminals are overrepresented, and White criminals are underrepresented. On the other hand, White victims are overrepresented and Black and Hispanic victims are underrepresented.

Poverty

Surveys have shown that Americans estimate the percentage of people below the poverty line who are Black at 50-55%, while the actual number is 29%. Martin Gilens analyzed photos of poor people in news magazine stories about poverty and found that 62% of them were Black. 65% of people identified as poor on ABC, CBS, NBC nightly news programs were Black. Compared to people with more accurate perceptions of reality, Whites who think most poor people are Black are:

  • More likely to blame poor people for their poverty.
  • More opposed to welfare.

Critique of implicit bias #3: Cultural awareness

The argument has been made that automatic preference for White on the IAT does not reflect individual prejudice but is nothing more than awareness of American culture. We learn prejudice in much the same way that we learn other norms of the society, or even the English language. This view suggests that we are not to blame for having an automatic preference for White.

Problems with this argument:

  1. The cultural awareness explanation implies that Americans should show greater uniformity of IAT scores than is found in practice.
  2. When American cultural traditions (like prejudice) are harmful to other people, do we have a moral obligation to avoid such beliefs?

Implicit Bias in the Legal System

A 2016 Washington Post analysis found that Black Americans are 2.5 times as likely to be shot and killed by police officers as White Americans. Unarmed Blacks are 5 times as likely to be shot and killed.

“Shooter Bias” Studies

White participants play a computerized video game. On each trial, they see a photograph of a young man in a realistic setting. Half of the men are White and half are Black. Half of the men of each race are holding handguns and the others are holding harmless objects, e.g., a cell phone or a soda can. Participants press one key for “shoot” and another for “don’t shoot,” and are required to respond rapidly (within 850 milliseconds).

Speed: If the target is armed, participants make the correct decision to shoot more quickly if he is Black than if he is White. If he is unarmed, they make the correct decision not to shoot more quickly if he is White.

Errors: The most common error is shooting an unarmed Black man; the least common is failing to shoot an armed Black man.

Followup studies show that:

  • The magnitude of the shooter bias was the same for Black and White participants.
  • The shooter bias is affected not only by race but also by target stereotypicality—that is, whether the target has dark skin and facial features associated with African-Americans.
  • Both police officers and civilians exhibit the shooter bias, but the police are faster overall and make fewer errors.
  • Both college students and police officers given repeated practice at the task were able to improve both their speed and their accuracy.
  • Self-reported personal beliefs about whether Blacks were dangerous, violent and aggressive did not predict shooter bias, but estimates of the % of the population holding these beliefs did.
  • Exposing participants to a newspaper article about a Black (vs. a White) criminal increased shooter bias.
  • In the original experiment, half the Black and half the White targets were armed. If you increase the % of Black targets who are armed, shooter bias increases. If you decrease the % of Black targets who are armed, there is less shooter bias.

Shooter bias can also be demonstrated using priming. Participants who are primed with Black faces are quicker to identify pictures of weapons than those who are primed with White faces. It also works in reverse. Priming participants with photos of weapons results in increased visual attention to Black (vs. White) faces.

Threat

The most likely explanation for shooter bias is that people find young Black men to be threatening. When police officers are shown photographs of faces and asked to judge whether they look like criminals, their judgments are affected by both race and stereotypicality. Even White faces are judged as more criminal if they have African-American facial features. Black men convicted of murder are more likely to receive the death penalty if they have dark skin and stereotypically Black facial features.

Both college students and police officers estimate Black teenagers to be older and less innocent than White or Latino children of the same age. When participants are shown pictures of faces of Black and White men matched for height and weight, Black men are judged taller, heavier, more physically fit and more dangerous, and police are said to be more justified in using force against them.

Session 5 (August 3)

Critique of Implicit Bias #4: Rational Behavior

Recall that one explanation for both the shooter bias and the IAT is that Whites fear Black people more than they fear other White people.

Some critics have suggested that, because Blacks commit certain crimes at higher rates than Whites, a preference for Whites on the IAT and the results of the shooter task reflect not prejudice but rational behavior. For example, Justice Department statistics suggest that a person is 7.49 times more likely to be robbed by a Black stranger than by a White stranger. The critics’ mathematical analysis suggests that it is rational to flee a Black man but not a White man.

Problems with this argument:

  1. The crime data are problematic, since Black perpetrators are more likely to be arrested and convicted than White perpetrators.
  2. By their analysis, the probability of being robbed by a Black stranger is still less than 1 in 100. Can racial bias (or shooting someone) be justified by such a low probability event?

Implications

On those rare occasions when a police officer is brought to trial for shooting a Black man, the officer usually says: “I feared for my life.” Acquittal is almost automatic. What is the moral status of that fear? Can we convict someone of a crime for an unconscious impulse which he or she was unable to resist?

The reasonable person test: What would a reasonable person do in this situation? We know that the average person is biased by race. If we assume the average person is a reasonable person, then implicit bias is reasonable and the officer is innocent. On the other hand, if we say it is not reasonable to assume that a person is dangerous just because he is Black, we are holding the police to a higher standard than can be expected of others. In reality, juries hold them to a lower standard.

The IAT and Voting

A study of the 2008 presidential election (Obama v. McCain) showed that preference for White on the IAT predicted an intention to vote for McCain, even after the effects of conservatism and other types of prejudice had been statistically removed. Similar results were found in the 2012 election.

Dog whistle politics refers to political messages that use coded racial appeals to automatically activate negative stereotypes of Blacks related to crime, welfare, etc. A dog whistle operates on two levels: it is not always detected and can be plausibly denied, yet it can produce strong automatic reactions. Consider, for example, Hillary Clinton’s repeated use of the phrase “hard-working Americans.” Studies show that people associate “Americans” with White people, and the adjective “hard-working” may imply that minorities are not hard-working.

Social psychologists have noticed that coded racial appeals operate in a manner similar to subliminal priming, and may serve as a stimulus to implicit bias.

Recall that photos of dark-skinned Blacks elicit more implicit bias than photos of light-skinned Blacks. One group of researchers examined still photos of Obama used in McCain campaign ads using a device that electronically measured brightness. Darker-skinned photos were used increasingly as the campaign progressed, especially in ads linking Obama to crime. The researchers then did two priming studies showing that darker images of Obama produced more implicit bias than lighter images.

Strategies for Reducing Implicit Bias

Making people aware of their biases is not enough. If implicit bias is an automatic System 1 response that occurs before System 2 has an opportunity to think, it could be argued that reducing it will be difficult and maybe impossible. Nevertheless, two strategies have been suggested:

  1. Change-based interventions—changing the associations underlying our implicit attitudes.
  2. Control-based interventions—leaving the associations intact, but reducing their effects on behavior, either through self-control or by taking the decision out of their hands.

Control-based interventions

The most extreme control-based interventions are those that take the decision out of the hands of the person who might be biased. An example is the “Sentinel,” a four-wheeled robot to be used by police to reduce the likelihood of violent confrontations during traffic stops.

In some cases, it may be possible to “blind” decision makers to the group membership of those they are evaluating, as in the case of symphony orchestra auditions in which the candidate performs behind a curtain. This solution has limited applicability, but should be used whenever possible.

When decision makers are biased, it may be possible to replace them with an algorithm, or mathematical formula. However, if the algorithm is constructed through a biased process, you will merely have substituted institutional racism for implicit bias. An example is COMPAS—an algorithm that predicts whether criminal defendants are likely to reoffend. It has been shown that Black defendants are almost twice as likely as White defendants to be mislabeled as likely to commit future crimes.
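“Mislabeled” here means a false positive: a defendant the algorithm flagged as high risk who in fact did not reoffend. As a rough, hypothetical sketch of how that rate is compared across groups (the records below are invented, not COMPAS data):

```python
# A false positive here means: a defendant labeled "high risk" who did
# NOT reoffend. The records are invented for illustration only.

records = [
    # (group, labeled_high_risk, reoffended)
    ("Black", True,  False), ("Black", True,  False), ("Black", False, False),
    ("Black", True,  True),  ("Black", False, True),
    ("White", True,  False), ("White", False, False), ("White", False, False),
    ("White", True,  True),  ("White", False, True),
]

def false_positive_rate(group):
    """Of the defendants in this group who did not reoffend,
    what fraction were nevertheless labeled high risk?"""
    non_reoffenders = [r for r in records if r[0] == group and not r[2]]
    flagged = [r for r in non_reoffenders if r[1]]
    return len(flagged) / len(non_reoffenders)

for group in ("Black", "White"):
    print(f"{group} false positive rate: {false_positive_rate(group):.0%}")

# With this toy data the Black false positive rate (2 of 3) is twice the
# White rate (1 of 3) -- the same kind of disparity reported for COMPAS.
```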

What about self-control? Can we use System 2 to override incorrect decisions made by System 1? One research group manipulated a control strategy they called an implementation intention—an if-then plan that links an environmental cue to a specific response. For example, when taking the IAT a person may decide, “When I see a Black face, I will think the word ‘good.’” This has been found to be effective in the short term.

Change-based interventions

One approach to changing people’s associations is to show them photos of well-known Black people with positive associations, e.g., Martin Luther King, and well-known White people with negative associations, e.g., Charles Manson. Another approach is to have participants read a vivid story in which they imagine they are beaten up by a White assailant and rescued by a Black man. These techniques have also been shown to work in the short term.

Calvin Lai and his associates organized a tournament in which social psychologists suggested 17 simple interventions to reduce implicit bias. Lai found that nine of them were effective in reducing automatic preference for Whites. However, in a followup study, he found that none of the nine strategies had effects that lasted as long as 2-4 days. These strategies produce elastic changes. After a short time, the level of implicit bias snaps back to what it was before.

Patricia Devine and colleagues did an 8-week study called “Breaking the prejudice habit.” She says overcoming prejudice is a lengthy process that requires considerable effort. People must be aware of their biases and concerned about their negative consequences. Her approach involves teaching people to recognize situations that activate biases and to replace biased responses with unbiased ones.

In week 1, college student participants took the IAT and received feedback on their scores. At this point, members of the control group were dismissed. The experimental group then received a 45-minute interactive training session, with a slide show and discussion. They were given a five-strategy “toolkit” for breaking the prejudice habit.

  1. Stereotype replacement: You recognize that you are responding to a person or situation in stereotypical fashion, consider the reasons, and replace the biased response with an unbiased one.
  2. Counter-stereotypic imagining: Think of examples—either famous or personally known—that show the stereotype to be inaccurate.
  3. Individuating: Gather information about the minority person’s background, family, hobbies, etc., so that judgments will be based on these particulars rather than group membership.
  4. Perspective-taking: Imagine yourself in the situation of the stereotyped person, e.g., what does it feel like to be trailed by detectives every time you enter a department store?
  5. Increasing opportunity for positive contact: Actively seek out situations in which you will have positive contact with Blacks, either through personal interaction or (carefully) the media.

The test group was recontacted twice to report on how they were doing. Both the test group and the control group retook the IAT at Week 4 and Week 8. The test group showed a significant reduction in automatic preference for White, while the control group did not. A replication with a larger sample confirmed this result. This time the control group also showed improvement (presumably an effect of retaking the IAT), but not as much as the experimental group. A two-year followup of some of the participants showed that members of the test group were more willing than members of the control group to publicly criticize a racist essay allegedly written by a fellow student.

Problem: These kinds of training programs often show strong placebo effects, in which the expectation that the program will help the participants produces positive changes even when the program itself is ineffective.

Interracial contact

There is reason to believe that the strongest item in Devine’s “toolkit” is interracial contact. In the longest study of implicit bias I know, 262 White college freshmen were randomly assigned either a Black or a White roommate. Implicit bias was measured at the beginning and end of the semester. The students who had a Black roommate reported less satisfaction with their living arrangement and less involvement with their roommates than those with a White roommate. However, their implicit bias declined significantly from beginning to end of the semester, while the implicit bias of those with White roommates was unchanged.

There is a long tradition of research in social psychology showing that interracial contact, under the right conditions, can reduce prejudice. The most important condition is that the group members must be able to cooperate rather than compete with one another. These studies, however, were not intended to address implicit bias.

When representatives of profit-making corporations claim that their (usually brief) training programs will reduce implicit bias in police officers, or anyone else, I recommend skepticism at the present time.

A reframing

When we receive feedback indicating that we have an automatic preference for White, we could (optimistically) interpret this as in-group favoritism rather than racial prejudice. This could be an invitation to re-examine some of our habits, such as with whom we socialize, and our patterns of volunteering and charitable giving, for evidence of in-group favoritism.