At this point it’s pretty clear why someone would be worried. We’re biased (Part 1). We don’t have total control over our bias (Part 2). And our bias seems to tamper with significant, real-world decisions (Part 3). So now that we’re good and scared, let’s think about what we can do! It’s time to talk about debiasing!
In the last post, we learned that implicit attitudes and stereotypes can badly affect our judgments. One way implicit attitudes and stereotypes are cultivated is conditioning: repeatedly present someone with a pair of stimuli until they begin to associate one thing with the other (De Houwer, Thomas, and Baeyens 2001; Hofmann, De Houwer, Perugini, Baeyens, and Crombez 2010). So, for example, if someone consumes media that repeatedly presents certain ideas or groups in a negative light, then they will cultivate a negative implicit attitude toward those ideas or groups (Arendt 2013; Arendt & Northup 2015; Matthes & Schmuck 2015). Or if a certain profession is dominated by white men, then people will associate membership in that profession with being white and male — and this might have a self-reinforcing effect on the profession’s makeup.
It turns out that we can use this mechanism of conditioning against our biases. It’s called counterconditioning, and it can be a useful debiasing tool (Stewart, Latu, Kawakami, and Myers 2010; Vansteenwegen, Baeyens, and Hermans 2015). One fairly easy and effective way to countercondition a negative stereotype is simply to imagine a positive version of a negatively valenced concept (Blair, Ma, and Lenton 2001; Markland et al. 2015). You can also use counterconditioning to weaken stereotypes and promote inclusivity. For example, if your area of work is dominated by, say, white men, then you have an opportunity to countercondition the stereotype of your profession every time you nominate, recommend, or choose someone for an opportunity at work. When you are thinking about who is eligible for these opportunities, go out of your way to consider candidates from underrepresented groups. And don’t do it quickly or at the last minute; make time for thinking outside the stereotype. If you’re a teacher, assign material from underrepresented groups. If you’re an office manager, choose imagery for the office that represents a diverse group of people.
CHANGING THE CIRCUMSTANCES
It turns out that altering the decision environment can also support debiasing. So if you have a say in the decision environment, then think about the following. First, make sure decision-makers have a stake in making good decisions. If decision-makers don’t have a salient stake in the quality of their decisions, then put them in a position that forces them to justify their decisions to a third party (Lerner & Tetlock 1999; Simonson & Nye 1992). Second, try to diversify the group of decision-makers (Bell, Villado, Lukasik, Belau, and Briggs 2011; Shor, Rijt, Miltsov, Kulkarni, and Skiena 2015). Third, make the important data easier to consume. Remove distracting information from the data, e.g., a job candidate’s identity, the status quo, unnecessary complexity, etc. (Célérier & Vallée 2014; Thaler & Sunstein 2008). Present the remaining data in a way that allows for easy comparison (Heath & Heath 2010; Hsee 1996; Russo 1977). And present the data on the most relevant scale (Camilleri & Larrick 2014; Burson, Larrick, and Lynch 2009; Larrick & Soll 2008). Talk about probabilities in terms of relative frequencies or, better yet, represent probabilities with visualizations [examples] (Fagerlin, Wang, and Ubel 2005; Galesic, Garcia-Retamero, & Gigerenzer 2009; Hoffrage, Lindsey, Hertwig, and Gigerenzer 2000).
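To see why frequency formats help, consider the kind of screening problem this literature studies. The numbers below are invented for illustration, but the contrast is the point: the same fact is much easier to grasp as “8 of the 103 people who test positive are actually sick” than as the equivalent Bayesian calculation over probabilities.

```python
# Illustrative numbers (not from any particular study): a condition with
# 1% prevalence, a test with 80% sensitivity and a 9.6% false-positive rate.
prevalence = 0.01
sensitivity = 0.80
false_positive_rate = 0.096

# Probability format: Bayes' rule. Correct, but opaque to most readers.
p_positive = prevalence * sensitivity + (1 - prevalence) * false_positive_rate
ppv = (prevalence * sensitivity) / p_positive

# Frequency format: restate the same numbers for 1,000 concrete people.
population = 1000
sick = round(population * prevalence)                                 # 10 people
sick_and_positive = round(sick * sensitivity)                         # 8 people
well_and_positive = round((population - sick) * false_positive_rate)  # 95 people
ppv_freq = sick_and_positive / (sick_and_positive + well_and_positive)

print(f"Bayes' rule: {ppv:.1%} of positive results are true positives")
print(f"Frequencies: {sick_and_positive} of {sick_and_positive + well_and_positive} "
      f"people who test positive are actually sick ({ppv_freq:.1%})")
```

Both formats yield roughly the same answer (about 8%), but the frequency version makes it obvious at a glance that most positives are false alarms.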
Certain reasoning strategies can also help you with debiasing (e.g., Larrick 2004; Fong and Nisbett 1991; Larrick, Morgan, and Nisbett 1990). First, use decision procedures: checklists, criteria, and rubrics, especially if you are making one kind of judgment repeatedly, e.g., grading, reviewing applications, etc. (Gawande 2010; Heath, Larrick, and Klayman 1998; Hales & Pronovost 2006). Whenever possible, use quantitative models as a guide, e.g., predictive linear models (Dawes 1979; Dawes, Faust, and Meehl 1989; Bishop & Trout 2008). And be wary of deviating from the decision procedure. Doing so might just create an opportunity for bias to influence your judgment.
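A predictive linear model in this spirit can be surprisingly simple. In line with Dawes’s point that even “improper” (e.g., equal-weight) linear models tend to beat holistic judgment, you can rate every candidate on the same pre-committed criteria and combine the ratings mechanically. A minimal sketch, with made-up criteria and scores:

```python
# A minimal sketch of an equal-weights ("improper") linear model in the
# spirit of Dawes (1979). The criteria and candidate scores are invented
# for illustration.

def linear_score(ratings, weights=None):
    """Combine per-criterion ratings (all on the same 1-5 scale) into one score.
    With no weights given, every criterion counts equally."""
    if weights is None:
        weights = [1.0] * len(ratings)
    return sum(w * r for w, r in zip(weights, ratings)) / sum(weights)

# Rate every candidate on the same criteria, committed to in advance and
# always in the same order: e.g. writing sample, experience, references.
candidates = {
    "A": [4, 3, 5],
    "B": [5, 2, 3],
    "C": [3, 4, 4],
}

# Rank by model score instead of by overall impression.
ranked = sorted(candidates, key=lambda c: linear_score(candidates[c]), reverse=True)
print(ranked)
```

The mechanical step is what does the debiasing work: once the criteria are fixed, an overall impression of a candidate has no channel through which to reshuffle the ranking.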
Finally, take advantage of people’s bias toward the status quo. When choosing between multiple options, if it is already well known that one option leads to the best outcome, then make that option the default (Chapman, Li, Colby, and Yoon 2010; Johnson, Bellman, and Lohse 2002; Johnson & Goldstein 2003; Madrian & Shea 2001).
Procedural reasoning is not fail-safe. “Training in normative rules often fails when people have strong intuitions and do not pause to think more deeply (McKenzie and Liersch 2011)” (Soll, Milkman, and Payne 2014). So force yourself to think more deeply. Put yourself in situations where your intuitions will be challenged. Partner with people who disagree with you so that they can readily generate arguments and evidence against your intuitions (Koriat, Lichtenstein & Fischhoff 1980). If you can’t find someone to play devil’s advocate, then play the role yourself: think of alternative arguments and conclusions (Herzog and Hertwig 2009, 2014; Keeney 2012). The point of all this is to prevent overconfidence in your intuitions, e.g., optimism bias (Ben-David, Graham, and Harvey 2013; Moore & Healy 2008).
As you can see, there are many debiasing tools in our toolkit: conditioning, circumstances, procedures, and skepticism. Admittedly, there might be times when we use these tools and bias rears its head anyway. In those situations there is yet another way to address the bias: feedback. That will be the topic of Part 5.
Part 1: I talk about the anxiety I had about being implicitly biased. “Am I discriminating without even knowing it?! How do I stop?!”
Part 2: What is implicit bias? Check out this post to learn about the theory and some of the findings.
Part 3: What are the real-world consequences of implicit bias? Research suggests implicit bias can impact hiring, pay, promotions, and more.
Part 4: [Jump to the top]
Part 5: Can feedback help? What if people are not used to giving each other feedback about their biases? Check out how to give and encourage feedback about our biases.
Featured image: “President Lyndon B. Johnson signs the 1964 Civil Rights Act as Martin Luther King, Jr., and others, look on” (Public Domain) via Wikimedia Commons [source]