You know how I do. When people make strong claims, I want evidence and arguments. So this US presidential campaign was a lot of work. A lot! (E.g., I read over 1000 pages about Clinton-related investigations alone). The problem is that people made loads of unsupported claims during the election. So I asked for loads of evidence. Curiously, people didn’t take kindly to my requests for evidence. As a reasoning researcher, this was fascinating. But as an aspiring reasoning teacher, it was thoroughly demoralizing. In this post, I’ll discuss my experience, some research that bears on my experience, and my concerns about post-fact reasoning.
1. Too Many Unsupported Claims
First of all, there were way too many unsupported claims in my midst. I just didn’t have the time to follow up on all of them.
People who know me might be surprised by this. After all, I interact with very few people. I go to work. And I go back home. I rarely “hang out”, as they say. So it’s not like I was canvassing local neighborhoods or searching the internet looking for unsupported claims. I was just going about my day. But in talking to a few friends/family, checking my online social network, running into people around town, etc., I was inundated with more unsupported claims than I could address.
2. People’s Responses Were Disappointing
When I asked for evidence, I expected people to give some evidence — even if it wasn’t always good evidence. But I was wrong. Usually, people didn’t even try to give evidence. And when they did give evidence, it was bad evidence. Looking back, people responded in one of four ways.
2.1 Change the subject
One way people responded was by changing the subject. So, for instance, when I asked for evidence for a damning claim about Hillary Clinton, they’d start talking about something else:
“Look, Trump tells it like it is. So […]”.
2.2 Further unsupported claims
Another way people responded was with further unsupported claims. For example, when I asked for evidence for the damning claim about Hillary Clinton, people would say,
She’s got blood on her hands!
She’s just not a good person!
She’s got character issues.
These claims, of course, require (a lot) more evidence, so now I had to ask for even more evidence.
2.3 Bad evidence
Sometimes people did give evidence — albeit bad evidence. For instance, when I asked an old co-worker why they thought that Hillary Clinton had bad character, they replied,
I watched the Benghazi hearings. I could see it on her face.
That was it. That’s all they could muster. (Aside: think about the fact that this person can serve on a jury.)
2.4 Attacks
The fourth way people responded was with some sort of attack:
You’re just a blind leftist.†
Let me guess, you also support baby killing.††
3. What Does This Mean?
Given my experience, I’m entirely unsurprised that some people call ours a “post-fact era” and some people call certain voters “irredeemable.” After all, the only kind of evidence that people seemed to appreciate was the evidence that confirmed the view they had before they started looking for evidence — #confirmationBias. And if people are only interested in confirming their own view of the world, then it would seem that they care more about feeling correct than they do about the facts or about “the common good” (Brennan 2012).
4. The Problem: It Was Never About Facts
Think about it. There are very careful, detailed, free, and easy-to-find reports produced by massive professional investigations and large-scale fact-checking operations out there.
Reports like these bear directly on many people’s unsupported claims — e.g., people’s unsupported claims about Hillary Clinton’s email use.
For example, the Inspector General’s report on email security and management found that Republican Secretary Powell used a private line to send official emails from a personal email address. It also found that Republican Secretary Rice (like Powell and Clinton) failed to follow State Department separation procedures regarding email. Oh, and it found that 90 of Powell’s/Rice’s immediate staff used personal emails for official business. When I mention this report’s findings around some people, they flat-out reject the report — and for fallacious reasons:
The [report/institution] is [liberal, government, mainstream, establishment, elitist, etc.]. I don’t trust that stuff.
Notice the mental shortcuts these people use to systematically dismiss certain evidence. They’re not dismissing evidence based on careful analysis. They’re dismissing it based solely on its association with a group that they don’t like. So their political claims aren’t even intended to be about facts; they’re about identity.
It’s like spectator sports: fans boo when a foul is called against their favored team — even when the foul is indisputable.
Related post: “The Bias Fallacy”
5. Is This Normal?
In short, kind of. Three sorts of empirical findings bear this out.
First, people systematically ignore relevant evidence (Kahneman 2013, Chapter 13). And even when people are presented with evidence, they are often biased in their assimilation of it — #biasedAssimilation (Lord et al 1979). For instance, when people are presented with counter-evidence, they systematically prefer the evidence that confirms their own view (Corner et al 2012; Hart and Nisbet 2012). That or they discount the counter-evidence (ibid.). And sometimes people actually strengthen their view when presented with counter-evidence — #backfireEffect (Nyhan and Reifler 2010).
Second, people do not trust perceived political opponents — and it’s getting worse (Hetherington & Rudolph 2014, 2015).
Third, trust in government in general has been falling since President Johnson (ibid.). More specifically, most Republicans report that they “never” trust the government, versus only a minority of Democrats who report the same thing (Hetherington & Rudolph 2015, italics added).
So when the Inspector General’s report about email conflicts with Republicans’ claims about Hillary Clinton, then it might be normal for Republicans to flat-out reject the report — and not because they’re Republican, but because they’re human. (NB: normal ≠ good, acceptable, etc.).
6. What now?
Honestly, I’m not sure. And frankly, dealing with people’s responses to my requests for evidence is exhausting.
Part of my exhaustion comes from having had the wrong expectations. Looking back, I (foolishly) expected people to care about evidence and to update their arguments and conclusions according to evidence (and logical norms). But apparently people don’t always do that.
Indeed, sometimes people don’t even have arguments! They just have the conclusions! #beliefBias
And sometimes people’s mission is not some sort of open-minded inquiry into the best arguments and evidence. Rather, their mission is to uphold their own group’s views and/or intentions (Mercier 2012; Mercier & Sperber 2011; Norman 2016; Sperber & Wilson 2001).
But when people reason this way, there might be no amount of counter-evidence that can update their view.
Their reasoning might be immune to facts.
So they might be irredeemable.
I hope that both of these hypotheses turn out to be wrong.
† (1) Answer: no. I’ve been registered as “No Party Affiliation” for at least 5 years — switching only when presidential primaries require a strategic change in affiliation. Before that, I was registered as a Republican. And so far, I’ve voted for Republican, Democratic, and third-party presidential candidates. (2) If asking for evidence is a “leftist” thing to do, then that seems like a point for leftists, no?
†† Answer: no.
Brennan, J. (2012). The Ethics of Voting. Princeton: Princeton University Press.
Corner, A., Whitmarsh, L., & Xenias, D. (2012). Uncertainty, scepticism and attitudes towards climate change: biased assimilation and attitude polarisation. Climatic Change, 114(3–4), 463–478. [Free PDF]
Hart, P. S., & Nisbet, E. C. (2012). Boomerang Effects in Science Communication: How Motivated Reasoning and Identity Cues Amplify Opinion Polarization About Climate Mitigation Policies. Communication Research, 39(6), 701–723. [Free PDF]
Haidt, J., & Hetherington, M. J. (2012). Look How Far We’ve Come Apart. The New York Times.
Hetherington, M. J., Long, M. T., & Rudolph, T. J. (2016). Revisiting the Myth: New Evidence of a Polarized Electorate. Public Opinion Quarterly, 80(S1), 321–350.
Hetherington, M. J., & Rudolph, T. J. (2015). Why Washington Won’t Work: Polarization, Political Trust, and the Governing Crisis. Chicago: University Of Chicago Press.
Kahneman, D. (2013). Thinking, Fast and Slow (1st edition). New York: Farrar, Straus and Giroux.
Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology, 37(11), 2098–2109. [Free PDF]
Mercier, H. (2012). Some Clarifications about the Argumentative Theory of Reasoning. A Reply to Santibáñez Yañez (2012). Informal Logic, 32(2), 259–268. [Free PDF]
Mercier, H., & Sperber, D. (2011). Why do humans reason? Arguments for an argumentative theory. Behavioral and Brain Sciences, 34(2), 57–74. [Free PDF]
Norman, A. (2016). Why we reason: intention-alignment and the genesis of human rationality. Biology & Philosophy, 1–20. [Free PDF]
Nyhan, B., & Reifler, J. (2010). When Corrections Fail: The Persistence of Political Misperceptions. Political Behavior, 32(2), 303–330. [Free PDF]
Sperber, D., & Wilson, D. (2001). Relevance: communication and cognition. Oxford; Cambridge, MA: Blackwell Publishers.