
Implicit Bias | Part 3: Workplace Bias

Think about decisions that people make every day. A committee decides who to hire. A supervisor rates an employee’s performance. A teacher grades a student’s assignment. A jury arrives at a verdict. A Supreme Court justice casts their vote. An emergency medical technician decides which victim to approach first. A police officer decides whether to shoot. These are instances in which workplace bias can have significant consequences.

I won’t be able to highlight every area of research on workplace bias. So I cannot delve into the findings that police officers sometimes show racial bias in decisions to shoot (Sim, Correll, and Sadler 2013, Experiment 2; see Correll et al 2007 and Ma and Correll 2011, Study 2, for findings that indicate no racial bias). And I cannot go into detail about how all-white juries are significantly more likely than other juries to convict black defendants (Anwar, Bayer, and Hjalmarsson 2012).

GENDER BIAS AT WORK

Instead, I’ll focus on the instances of workplace bias to which most people can relate. If you’re like most people, then you need to work to live, right? So let’s talk about how bias can affect our chances of being hired.

Researchers presented participants with résumés representing male and female job applicants. Participants could easily determine an applicant’s gender from the name on the résumé — “Joan McKay” vs. “John Smith.” Some résumés portrayed stereotypically male traits — e.g., “achievement-oriented” or “power-seeking.” Interestingly, men’s judgments about these résumés changed depending on whether the résumé belonged to a man or a woman. Specifically, men judged female applicants with stereotypically male traits to be less likable, less competent, and less socially adept than male applicants with the same traits (Gill 2004; Tyler and McCullough 2009 [PDF]). Crazy, right? Men judged job applicants differently based not on credentials or traits, but on gender!

BEING OBJECTIVE MIGHT NOT HELP

You might be thinking, “Maybe people in these studies weren’t taking the task seriously. They’d make better decisions if they were on an actual hiring committee.” The idea here is that in the real world, people take hiring decisions seriously, so they’ll overcome their biases. Mmmm…I don’t know about that.

In some studies, participants who were instructed to imagine that they were serving on actual hiring committees made more egalitarian judgments (Williams and Ceci 2015 [PDF]). But in other studies, priming participants to be objective only increased gender discrimination (Uhlmann and Cohen 2007 [PDF]). So even if we try to be more objective when making real-world decisions, it’s not clear that this would eliminate workplace bias.

BEING SMART MIGHT NOT HELP

Again, someone might still want to object. “Here’s the problem: the research participants! They’re not smart or objective enough! If we test smart, objective people, then we’ll see that they overcome their biases!” That sounds intuitively plausible. Alas, the evidence seems to falsify the intuition.

You probably think that scientists are smarter and more objective than most people (Veldkamp et al 2017). But guess what: even scientists exhibit workplace bias.

Corinne A. Moss-Racusin and colleagues found that science professors were biased in their evaluations of applications for a lab manager position. The application materials came from either male or female applicants and — here’s the key — the materials were identical except for the gender-linked names of the applicants. Based on these materials, both male and female science professors judged the male applicants to be significantly more competent and hireable than the female applicants (Moss-Racusin, Dovidio, Brescoll, Graham, and Handelsman 2012 [PDF]) … and it gets worse: both male and female professors offered male applicants higher starting salaries and more mentoring opportunities than they offered female applicants (ibid.) — even though the male and female applicants’ materials were identical! Yikes!

But application materials are not always identical. So what about real-world application materials? Well, the evidence is still damning. Consider real-world applications for assistant professorships: both male and female recommendation letter-writers raised more doubts about the women they recommended than about the men they recommended (Madera et al 2018).

This suggests that even presumably smart, objective people fail to overcome workplace bias — even about the people that they want to hire and recommend.†

BIAS IN ACADEMIA

Apparently, academics in certain fields tend to believe, to varying degrees, that only innately brilliant people can succeed in their field. And Sarah-Jane Leslie and colleagues found that the academic fields that prize brilliance the most have the lowest representation of women and black people (Leslie et al 2015 [PDF]). The suggestion is that some areas of academia might stereotype women and black people as lacking innate brilliance.

Philosophy fail. It turns out that philosophers emphasize the importance of innate brilliance more than any other group of academics (ibid.). And sure enough, women — and other groups — are drastically underrepresented in philosophy: only 18.7% of professional philosophers were women in 2006, and 25.4% in 2009 (Buckwalter and Stich 2014 [PDF]). There are various proposals for the cause(s) of this underrepresentation of women in philosophy (Adleberg, Thompson, and Nahmias 2014 [PDF]; Baron, Dougherty, and Miller 2015 [open access]; Thomson, Adleberg, Sims, and Nahmias 2016 [open access]), but it would not be surprising to find that bias is one of them (Saul 2013 [PDF]).

DISAGREEMENT ABOUT WORKPLACE BIAS

Not everyone agrees about the role of bias in underrepresentation. For instance, Stephen Ceci and colleagues argue that bias doesn’t account for the continued underrepresentation of women in science (Ceci et al 2014 [PDF]). More recently, Wendy Williams and Stephen Ceci conducted a study suggesting that hiring committees might actually prefer women to men in science, technology, engineering, and mathematics (STEM) (Williams and Ceci 2015 [PDF]). These results might not be beyond reproach, however. For example, sociologist Zuleyka Zevallos [Other Sociologist], philosopher Michael Brownstein [Feminist Philosophers], and professors Joan C. Williams and Jessi L. Smith [Chronicle of Higher Ed.] express multiple concerns about Williams and Ceci’s 2015 study.

There is certainly some reason to think that bias plays a role in our everyday judgments, and there is certainly evidence that certain groups are underrepresented in many settings. There also seems to be a relationship between bias and underrepresentation — even persistent defenders of Ceci and Williams’ studies admit that much [Quartz]. But questions about precisely when and how bias influences representation are by no means settled.

CONCLUSION

Let’s take stock. After reviewing a bit more of the implicit bias research, we found that:

  1. Certain people are grossly underrepresented in certain fields, but researchers disagree about whether and how bias produces that underrepresentation.
  2. People’s judgments about one and the same résumé changed on the basis of perceived gender.
  3. Even presumably smart people seem to be implicitly biased in their judgments about the people that they want to hire and recommend.
  4. Attempting to be objective did not seem to completely prevent implicitly biased behavior.

Series Outline

Part 1: I talk about the anxiety I had about being implicitly biased. “Am I discriminating without even knowing it?! How do I stop?!”

Part 2: What is implicit bias? Check out the theory and some of the interesting findings.

Part 3: [Jump to the top]

Part 4: What can we do about implicit bias? The research on de-biasing suggests a few strategies for limiting biased judgments and behavior. Spoiler: many people and institutions aren’t using these strategies.

Part 5: How can we address each other’s biases? It’s time for Feedback 101, featuring tips for developing a culture of helpful feedback and some examples of how to give feedback to one another.

 


† Honestly, I don’t think that scientists are necessarily smarter than anyone else. Sure, they might have certain tools and habits that allow them to reason in more desirable ways. But I imagine that pretty much anyone could learn to use these tools and cultivate these habits.

Image: “Nortraships office in London, Tanker Departement 1944” (Public domain), via Wikipedia

Published by

Nick Byrd

Nick is a cognitive scientist at Florida State University studying reasoning, wellbeing, and willpower. Check out his blog at byrdnick.com/blog