"Education [in part]" (1890) by Ragesoss via Wikipedia [public domain]

Philosophers’ Reasoning Errors and Their Beliefs: Related?

“Philosophers’ reasoning is generally better than non-philosophers’.”

The evidence for this claim is hard to dispute. For example, taking courses in analytic philosophy and argument mapping does more for students’ critical thinking than even dedicated critical thinking courses do (Alvarez-Ortiz 2007). Further, the more training one has in philosophy, the better one does on certain reasoning tasks (Livengood et al. 2010). Finally, philosophy majors tend to outperform almost every other major on the GRE, the GMAT, and the LSAT (“Why Study Philosophy…”; see also Educational Testing Service 2014; Nieswiadomy 2009). In light of this evidence, it’s no wonder that researchers like Deanna Kuhn have such high praise for philosophers’ reasoning (Kuhn 1991, 258–262).†

Reasoning expertise: We turn now to the philosophers…. The performance of the philosophers is not included in table form because it is so easily summarized. No variation occurs…philosophers [show] perfect performance in generation of genuine evidence, alternative theories, counterarguments, and rebuttals…. The philosophers display a sophisticated understanding of argumentative structure…. None of the philosophers [had] any special expertise in any of the content domains that the questions address…. The performance of philosophers shows that it is possible to attain expertise in the reasoning process itself, independent of any particular content to which the reasoning is applied.

In this post I will present some research on philosophers’ reasoning and answer two questions.

TWO QUESTIONS

First, it’s one thing to claim that philosophers are better reasoners; it’s quite another to claim that they are perfect reasoners. After all, philosophers might reason better than others and yet still be vulnerable to systematic reasoning errors. So we need to ask: Is philosophers’ reasoning susceptible to systematic error?

Second, if philosophers do err systematically, then we might worry that philosophers arrive at their views in systematically erroneous ways. One way to address this worry is to ask a second question: Is there a relationship between philosophers’ systematic errors and their views?

Spoiler: The answer to both questions is yes. More detailed answers are in the rest of this post. Further details can be found in Byrd 2014 and other papers. 

1. Is philosophers’ reasoning susceptible to systematic error?

In order to understand the rest of the post, you will need to answer the question below. It should only take a moment.

A bat and a ball cost $1.10 in total. The bat costs $1.00 more than the ball. How much does the ball cost?

The question comes from the Cognitive Reflection Test (CRT) (Frederick 2005). It is designed to elicit a quick, intuitive answer. What answer first came to your mind?

If you are like most people, one answer quickly came to mind: “10 cents.” And if you are like many people, you had an intuitive sense that this answer was correct. Alas, 10 cents is not correct. You can work out the correct answer on your own if you like. The point I want to make is this: the intuitively correct answer to this question is demonstrably false. This suggests that answering this question intuitively constitutes an error in reasoning. 
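If you want to check your work, the problem’s constraint is easy to encode; here is a quick sketch in Python (fair warning: the last line gives away the solution). Working in whole cents avoids floating-point noise.

```python
def total_cost_cents(ball_cents):
    """Total cost of bat and ball, in cents."""
    bat_cents = ball_cents + 100  # the bat costs $1.00 more than the ball
    return ball_cents + bat_cents

print(total_cost_cents(10))  # intuitive answer: totals 120 cents, not the required 110
print(total_cost_cents(5))   # totals 110 cents, as the problem requires
```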

It turns out that philosophers are less likely than others to make this error.

Jonathan Livengood and colleagues found that the more philosophical training one had, the less likely one was to make this error (Livengood et al 2010). I replicated this finding a few years later (Byrd 2014). Specifically, I found that people who had — or were candidates for — a PhD in philosophy were significantly less likely than others to make this reasoning error — F(1, 558) = 15.41, p < 0.001, d = 0.32 (ibid.).
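For the statistically curious, here is a minimal sketch of the kind of group comparison behind numbers like these. The data below are made up for illustration; this is not the actual analysis code from Byrd 2014.

```python
# Toy one-way ANOVA plus Cohen's d on hypothetical CRT scores (0-3 correct).
import numpy as np
from scipy import stats

philosophers = np.array([3, 2, 3, 3, 1, 2, 3, 3])
non_philosophers = np.array([1, 0, 2, 1, 0, 1, 2, 0])

f_stat, p_value = stats.f_oneway(philosophers, non_philosophers)

# Cohen's d: mean difference divided by the pooled standard deviation.
n1, n2 = len(philosophers), len(non_philosophers)
pooled_sd = np.sqrt(((n1 - 1) * philosophers.var(ddof=1) +
                     (n2 - 1) * non_philosophers.var(ddof=1)) / (n1 + n2 - 2))
d = (philosophers.mean() - non_philosophers.mean()) / pooled_sd

print(f"F = {f_stat:.2f}, p = {p_value:.4f}, d = {d:.2f}")
```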

Some philosophers performed perfectly on the CRT, and this held even after controlling for prior familiarity with the CRT. However, many philosophers did not perform perfectly: they made the error of responding intuitively on one or two of the CRT’s three questions. This implies an answer to our first question.

Answer: Yes. Philosophers’ reasoning is susceptible to systematic error.

So what about our second question?

2. Is there a relationship between philosophers’ systematic errors and their views?

Among lay reasoners, the tendency to make this reasoning error on the CRT correlates with, and even primes, various theistic beliefs — e.g., the belief that God exists, that immortal souls exist, that life experiences can count as evidence that a god exists, etc. (Shenhav, Rand, and Greene 2012). This finding is in line with a common theme in the research on reasoning: quick and intuitive reasoning predicts a whole bunch of religious, supernatural, and paranormal beliefs (Aarnio and Lindeman 2005; Bouvet and Bonnefon 2015; Gianotti et al. 2001; Pennycook et al. 2012, 2013, 2014; Pennycook 2014).

And this finding has been replicated among philosophers. Specifically, philosophers who were more likely to make a reasoning error on the CRT were significantly more likely to lean towards or accept theism — F(1, 559) = 7.3, p < 0.01, d = 0.16, b = 0.12 (Byrd 2014).
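In case the b above is unfamiliar: it is the sort of coefficient you get from a regression. Here is a toy sketch of such an analysis with simulated data; it is not the actual data, coding, or model from Byrd 2014.

```python
# Toy regression: predicting a (simulated) lean toward theism from the
# number of intuitive CRT errors. Data and effect size are made up.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
crt_errors = rng.integers(0, 4, size=200).astype(float)  # 0-3 intuitive errors
theism_lean = 0.1 * crt_errors + rng.normal(size=200)    # weak positive association

X = sm.add_constant(crt_errors)  # intercept column + predictor
model = sm.OLS(theism_lean, X).fit()
print(model.params)   # the slope plays the role of b
print(model.pvalues)
```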

There is also evidence that people who make the intuitive error on the CRT are more prone to certain moral judgments. To see what I mean, read the scenario below (Foot 1967).

You see a trolley racing down its track towards five people. You happen to be standing near a switch that would divert the trolley down a sidetrack toward one person. If you pull the switch, the trolley will surely kill one person. If you do not pull the switch, the trolley will surely kill five people. Do you pull the switch?

So? Would you pull the switch or not? If you answered intuitively on the CRT question, then you might be less likely to pull the switch (Paxton, Ungar, and Greene 2012).

Once again, it turns out that this finding holds among philosophers as well. Philosophers who were more likely to make a reasoning error on the CRT were significantly less likely to pull the switch — F(1, 559) = 6.93, p < 0.01, d = 0.15, b = 0.17 (Byrd 2014).

Philosophers’ proclivity to make this error was also positively associated with other philosophical views:

  • Physical (as opposed to psychological) views of personal identity — F(1, 558) = 8.57, p < 0.01, d = 0.17.
  • Fregeanism (as opposed to Russellianism) about language — F(1, 558) = 8.59, p < 0.01, d = 0.17.

I have lots of thoughts about these findings, but I want to keep things brief. For now, consider the implied answer to our second question.

Answer: Yes. Philosophers’ reasoning errors are related to their views.

CONCLUSION

So there you have it. It would seem that philosophers are susceptible to systematic reasoning errors. And insofar as philosophers are so susceptible, they tend toward certain views. I’m tempted to say more, but I’ve already done so elsewhere (Byrd 2014); so have others.†† I’m hoping enough feathers are still ruffled to spark some good conversation.

† Thanks to Greg Ray for pointing me to this passage.

†† First, I’ve offered only select evidence that philosophers’ reasoning is privileged. (A) What does the rest of the literature suggest about philosophers’ reasoning? Unsurprisingly, the verdict is disputed (Nado 2014; Machery 2015; Mizrahi 2015; Rini 2015). Indeed, in some contexts, philosophers don’t seem to reason any better than anyone else (Schwitzgebel and Cushman 2015; Pinillos et al. 2011). Second, even if philosophers are better reasoners, it’s not clear why (Clarke 2013). (B) Why would philosophers be better reasoners than others? I sketch an account in Byrd 2014, Section 3 (see also Weinberg, Gonnerman, Buckner, and Alexander 2010). Finally, if philosophers really are better reasoners, then this might have interesting implications for public discourse. (C) Should non-philosophers defer to the judgments of philosophers? (D) Should philosophers’ reasoning methods be the gold standard? I’m happy to discuss these questions in the comments.

References

Aarnio, K., & Lindeman, M. (2005). Paranormal beliefs, education, and thinking styles. Personality and Individual Differences, 39(7), 1227–1236. [PDF] [HTML]

Alvarez-Ortiz, C. (2007). Does philosophy improve critical thinking skills? Unpublished Thesis. The University of Melbourne. [PDF]

Bouvet, R., & Bonnefon, J.-F. (2015). Non-Reflective Thinkers Are Predisposed to Attribute Supernatural Causation to Uncanny Experiences. Personality and Social Psychology Bulletin, 0146167215585728. [HTML]

Byrd, N. (2014). Intuitive And Reflective Responses In Philosophy. University of Colorado. [PDF]

Clarke, S. (2013). Intuitions as Evidence, Philosophical Expertise and the Developmental Challenge. Philosophical Papers, 42(2), 175–207. [HTML]

Educational Testing Service. (2014). “General Test Percentage Distribution of Scores Within Intended Broad Graduate Major Field Based on Seniors and Nonenrolled College Graduates”, Table 4. [PDF]

Foot, P. (1967). The problem of abortion and the doctrine of the double effect. Oxford Review, 5. [PDF]

Frederick, S. (2005). Cognitive Reflection and Decision Making. Journal of Economic Perspectives, 19(4), 25–42. [PDF] [HTML]

Gianotti, L. R., Mohr, C., Pizzagalli, D., Lehmann, D., & Brugger, P. (2001). Associative processing and paranormal belief. Psychiatry and Clinical Neurosciences, 55(6), 595–603. [HTML]

Kuhn, D. (1991). The Skills of Argument. Cambridge University Press. [Amazon]

Livengood, J., Sytsma, J., Feltz, A., Scheines, R., & Machery, E. (2010). Philosophical temperament. Philosophical Psychology, 23(3), 313–330. [PDF] [HTML]

Machery, E. (2015). The illusion of expertise. In E. Fischer & J. Collins (Eds.), Experimental Philosophy, Rationalism, and Naturalism: Rethinking Philosophical Method (p. 188). Routledge. [Amazon]

Mizrahi, M. (2015). Three Arguments Against the Expertise Defense. Metaphilosophy, 46(1), 52–64. [PDF] [HTML]

Nado, J. (2014). Philosophical Expertise. Philosophy Compass, 9(9), 631–641. [PDF] [HTML]

Nieswiadomy, M. (2009). LSAT Scores of Economics Majors: The 2008-2009 Class Update. SSRN Scholarly Paper. Social Science Research Network. [PDF] [HTML]

Paxton, J. M., Ungar, L., & Greene, J. D. (2012). Reflection and Reasoning in Moral Judgment. Cognitive Science, 36(1), 163–177.

Pennycook, G. (2014). Evidence that analytic cognitive style influences religious belief: Comment on Razmyar and Reeve (2013). Intelligence, 43, 21–26. [PDF] [HTML]

Pennycook, G., Cheyne, J. A., Barr, N., Koehler, D. J., & Fugelsang, J. A. (2014). Cognitive style and religiosity: The role of conflict detection. Memory & Cognition, 42(1), 1–10. [PDF] [HTML]

Pennycook, G., Cheyne, J. A., Koehler, D. J., & Fugelsang, J. A. (2013). Belief bias during reasoning among religious believers and skeptics. Psychonomic Bulletin & Review, 20(4), 806–811. [PDF] [HTML]

Pennycook, G., Cheyne, J. A., Seli, P., Koehler, D. J., & Fugelsang, J. A. (2012). Analytic cognitive style predicts religious and paranormal belief. Cognition, 123(3), 335–346. [PDF] [HTML]

Pinillos, N. Á., Smith, N., Nair, G. S., Marchetto, P., & Mun, C. (2011). Philosophy’s New Challenge: Experiments and Intentional Action. Mind & Language, 26(1), 115–139. [PDF] [HTML]

Rini, R. A. (2015). How not to test for philosophical expertise. Synthese, 192(2), 431–452. [PDF] [HTML]

Schwitzgebel, E., & Cushman, F. (2015). Philosophers’ biased judgments persist despite training, expertise and reflection. Cognition, 141, 127–137. [PDF] [HTML]

Shenhav, A., Rand, D. G., & Greene, J. D. (2012). Divine intuition: Cognitive style influences belief in God. Journal of Experimental Psychology: General, 141(3), 423. [PDF] [PsycNET]

Weinberg, J. M., Gonnerman, C., Buckner, C., & Alexander, J. (2010). Are philosophers expert intuiters? Philosophical Psychology, 23(3), 331–355. [HTML]

Why Study Philosophy? An Unofficial “Daily Nous” Affiliate: Charts & Graphs. (n.d.). Retrieved November 14, 2015, from [URL]

 

(Image: “Education (center)” (1890) by Louis Comfort Tiffany, public domain)

Published by

Nick Byrd

Nick is a cognitive scientist studying reasoning, wellbeing, and willpower. When he is not teaching, in the lab, writing, exercising, or relaxing, he is blogging at www.byrdnick.com/blog

7 thoughts on “Philosophers’ Reasoning Errors and Their Beliefs: Related?”

  1. Hi Nick, nice post. I’m curious… By “intuitively” do you mean answering a question before fully examining the question in a rational manner?

    1. Good question! For the purposes of this post, ‘intuitively’ refers to what naturally comes to mind prior to expending mental effort and/or prior to reflection. I don’t mean to import anything about rationality into the definition.

  2. Oh I see… The awareness that overrides the immediate intuition doesn’t necessarily have to be rational in nature, but it has to elicit some type of reflection and/or mental effort.

    The first question is a real tease. My friend had convinced himself the answer was 10 cents. After I explained the correct answer to him, he went through a brief period of resolute ‘dumbfoundedness’ before realizing, “Oh yeah…”

    1. Right. At least, that’s the way I’m thinking of things in this post.

      Interesting story about your friend. I’m often surprised at how powerful my sense of being correct can be. It can take months or years of thinking carefully about something for me to realize that a deeply held intuition doesn’t hold up to scrutiny. When we’re talking about simple math, it’s easy for us (or a friend) to check our work and reveal the error. It’s not so easy when the question is more complex (e.g., in philosophy, politics, economics, social science, etc.). In a later post about Christian apologetics, I’ll say more about this.

      Thanks for weighing in. Always a pleasure!

  3. I hear you on that! Being a philosopher is a big responsibility. You have to constantly open yourself up to new ideas that often attack your own personal intuitions. Unfortunately, it seems that we live in a society where fast answers are valued more highly than quality answers; if someone takes the time to “think” about their answer, then they are automatically labeled as slow or ignorant.

    Ever notice on the news (CNN, Fox, etc.) how the guests on a debate panel never have time to think about their response? They either talk at 100 mph to say what they had probably already rehearsed, or, if they actually stop to think, they get interrupted by the host. It’d be something to see some ‘dumbfounding’ actually happen on live TV. Then maybe I’d watch it!

    I’d like to think that TIME is necessary if you want to express (or understand for yourself) a novel idea; if that time is not allowed, then the best you can do is recycle your own intuitions.

    Anyway, I won’t rant any longer. I look forward to your next post!

    1. The news example is an interesting one, since it highlights one way in which reflection can lead to quick answers that aren’t so faulty. That is, we can reflect on something frequently, course-correct as needed, and thereby solidify our view such that, when asked about it, we can answer quickly. But this quick response might not be of the same nature as the quick responses on the CRT (because it is, at least in part, the result of effortful and careful reflection). Experimental psychologists have recently realized that the distinction between these kinds of quick responses could be significant, so they are developing better methods to differentiate between them (e.g., process dissociation: Jacoby 1991 [PDF]; Conway and Gawronski 2013 [PsycNET]).
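      In case it helps, here is a toy sketch of the standard process-dissociation equations (my own illustration with made-up numbers, not code from either of those papers):

```python
# Process-dissociation estimates of controlled (C) and automatic (A)
# contributions from inclusion/exclusion performance:
#   C = P(inclusion) - P(exclusion);  A = P(exclusion) / (1 - C)
def process_dissociation(p_inclusion, p_exclusion):
    c = p_inclusion - p_exclusion
    a = p_exclusion / (1 - c) if c < 1 else float("nan")
    return c, a

c, a = process_dissociation(p_inclusion=0.80, p_exclusion=0.30)  # hypothetical rates
print(f"controlled = {c:.2f}, automatic = {a:.2f}")  # controlled = 0.50, automatic = 0.60
```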

      Anyway, I’m glad you brought this up. You’re doing well to reveal the nuances in this research. Nice work!

  4. Thanks for the explanation. I may have strayed off topic a little on my last post. I’ll check out the article on process dissociation.

Comments are closed.