APA Interview & Deleted Interview Questions

I recently answered some questions from Skye Cleary, managing editor of the APA blog. Some of the interview questions were really fun. In fact, I ended up going over the word limit. So I had to delete some things. But if you’re interested in the deleted interview questions, then you can find some of them below. The main interview is here: “APA Member Interview: Nick Byrd”.

The Deleted Interview Questions

What time of day are you most productive and creative?

My mind is at its best in the first 6-7 hours of my day. When I’m smart, I use those hours to accomplish the most demanding and important things on my to-do list. But when I’m foolish, I waste that time on mindless and/or unimportant work.

What do you like to do outside work?

Hmm. It varies.

What is your favorite quote?

“You cannot escape philosophy.”

I’m guessing that this has been uttered many times by many people. The last time I saw this line was when I read “Neuroscience Needs Behavior: Correcting a Reductionist Bias” in Neuron. Here’s a free copy of the paper.

What is your least favorite type of fruit and why?

Mango. That smell. Yuck. I feel sick just thinking about it.

What would you like your last meal to be?

Whatever makes my body most useful to science.

The Bias Fallacy

“They’re biased, so they’re wrong!” That’s a fallacy. Call it the bias fallacy. Here’s why it’s a fallacy: being biased doesn’t entail that everything one does is wrong. So when someone jumps from the observation that someone is biased to the conclusion that they’re wrong, they have committed a fallacy. It’s that simple.

In this post, I’ll give some examples of the fallacy, explain the fallacy, and then suggest how we should respond to the bias fallacy.

1. Examples of The Bias Fallacy

You’ve probably seen instances of the bias fallacy all over the internet.

In my experience, the fallacy is a rhetorical device. The purpose of the bias fallacy is to dismiss some person or their claims.

Like many rhetorical devices, this one is logically fallacious. So it’s ineffective. At least, it should be ineffective. That is, we should not be persuaded by it.

So if you’ve seen the bias fallacy online, then go ahead and set the record straight:

'They're biased, so they're wrong.' Not so fast! We can be biased without being wrong. #TheBiasFallacy

And if you really want to have some fun, go ahead and join the discussion on Reddit.

2. Explanation of The Bias Fallacy

Let me clarify what the bias fallacy is. The bias fallacy has a certain structure. It identifies So-and-so’s bias. And then it moves from the mere existence of So-and-so’s bias to the conclusion that So-and-so is incorrect. So, in its simplest form, the structure of the bias fallacy involves two parts:

So-and-so is biased.

Therefore, So-and-so is wrong.

It is important to notice that the bias fallacy involves a particular inference: the falsehood is supposed to be entailed by the bias. This relationship is represented by the ‘therefore’ in the example above. But it might be represented by all sorts of words and phrases in other contexts (e.g., ‘so’, ‘because’, ‘thereby’, etc.).

And that inference is the fallacy. It’s a fallacy because the inference is wrong: falsehood does not necessarily follow from bias. After all, someone can be biased and also be correct. Again, it’s that simple.
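The point can be sketched in a few lines of Python. This is a toy illustration with made-up numbers (a thermometer rather than a person), not anyone’s actual argument:

```python
# Toy illustration: a biased instrument can still support a true claim.
# All numbers here are invented for the sake of the example.

actual_temperature = 10.0   # degrees Celsius
thermometer_bias = 2.0      # this thermometer always reads 2 degrees high

reading = actual_temperature + thermometer_bias

# The reading is biased, yet the claim it supports is still true.
claim_above_freezing = reading > 0.0
print(claim_above_freezing)  # prints: True
```

The thermometer is systematically biased, but “it is above freezing” is true all the same. Dismissing the claim just because the instrument is biased would be the bias fallacy.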

But we should be careful here. Take a look at two claims with a structure that is similar to the bias fallacy. This time, however, the fallacious inference is missing:

So-and-so is biased.

So-and-so is wrong.

These claims are entirely independent. There is no inference from one to the other. So this is not an instance of the bias fallacy.

3. But bias increases our chances of being wrong, right?

At this point, you might be thinking that I am merely making some esoteric point that only academics care about. After all, bias is always bad, right? So even if bias doesn’t entail falsehood, it still increases our chances of being wrong…right?


Think about the ways in which someone can be biased. Sure, people can be biased against certain races, genders, classes, etc. We probably agree that those are bad biases. And those biases can lead to false claims — e.g., false claims about race, gender, etc. But this doesn’t mean that all biases are bad, that all biases lead to falsehood, or even that all biases increase the chances of being wrong.

Consider medical science. If we want to find out whether smoking causes cancer, then we are going to want to observe peoples’ smoking habits, peoples’ cancer rates, and perhaps some other related variables (e.g., peoples’ exposure to certain kinds of radiation, etc.). But there’s lots of stuff that we do not want to measure. We don’t want to observe people’s music preferences, favorite color, etc. Those variables are irrelevant to the causes of cancer. So good scientific investigation seems to require certain biases. In this case, it was a selection bias: a bias in favor of only relevant evidence.
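This kind of selection bias can be sketched as simple evidence filtering. The variable names and relevance judgments below are invented for illustration:

```python
# Sketch of a "good" selection bias: attend only to relevant evidence.
# Variables and relevance judgments are invented for illustration.

observations = {
    "smoking_habits": "1 pack/day",
    "radiation_exposure": "low",
    "music_preference": "jazz",   # irrelevant to cancer risk
    "favorite_color": "blue",     # irrelevant to cancer risk
}

relevant_variables = {"smoking_habits", "radiation_exposure"}

# The selective investigator keeps only the relevant evidence.
evidence = {k: v for k, v in observations.items() if k in relevant_variables}
print(sorted(evidence))  # prints: ['radiation_exposure', 'smoking_habits']
```

The investigator is biased in favor of some observations over others. But that bias makes the investigation better, not worse.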

And this is true more generally: good investigations should involve certain biases.

And, therefore, bias does not necessarily increase the chances of being wrong. In some cases, bias might actually decrease the chances of being wrong.

4. What To Do About The Bias Fallacy

Imagine that you catch yourself, or someone catches you, committing the bias fallacy: you erroneously inferred that So-and-so is wrong from So-and-so’s bias. What should you do now?

First, acknowledge the fallacy.

Second, acknowledge that it is a fallacy. (And, to be clear, the bias fallacy is usually an instance of an existing class of fallacies, not a newfound fallacy.)

Third, reflect on the merits and demerits of the claim that you thought was wrong.

That is, consider whether So-and-so is right regardless of their bias(es). Some people are less likely to reflect on this (Jost & Krochik 2014; Linvill 2013; Linvill & Mazer 2011, 2013). So some people might have to try harder than others.

But most of us will have to go way out of our way to properly reflect on the matter. We cannot simply go to our friends, family, and usual information outlets. After all, they probably have the same biases that we have (Halberstam & Knight 2016). So they’ll probably see only the merits and demerits that we see. Instead, we have to seek out people and institutions that disagree with us and/or have different biases than we do. Those people will be much more motivated to notice merits and demerits of the claim that we would otherwise overlook. So we will not have a full picture of the merits and demerits of the claim until we’ve genuinely considered their perspective.

Now let’s say that you’ve done these steps: you’ve admitted the fallacy and done the hard investigative work. So you can now clearly, cogently, and concisely explain why So-and-so’s claim is right or wrong. Now you’re ready for the final step.

Fourth, document your explanation of why So-and-so is right or wrong — so that you don’t have to redo it from memory every time the topic comes up. Better yet, make the explanation public so that others can scrutinize it and — if it passes muster — appreciate it.

5. Take-aways

Being biased does not entail being wrong. It is possible that someone who is biased is wrong, but they are not necessarily wrong because they are biased. And — more importantly — biased people or institutions are sometimes correct.

So we cannot dismiss the claims of biased people and institutions just because of their bias.

Instead, we have to carefully and thoroughly evaluate claims on a case-by-case basis. And that is no easy feat. We often have to go way out of our way to do this. For instance, in 2016, I had to read over 1000 pages of investigative reports and talk to lots of people that I don’t usually talk to.



Halberstam, Y., & Knight, B. (2016). Homophily, group size, and the diffusion of political information in social networks: Evidence from Twitter. Journal of Public Economics, 143, 73–88.

Jost, J. T., & Krochik, M. (2014). Chapter Five – Ideological Differences in Epistemic Motivation: Implications for Attitude Structure, Depth of Information Processing, Susceptibility to Persuasion, and Stereotyping. In A. J. Elliot (Ed.), Advances in Motivation Science (Vol. 1, pp. 181–231). Elsevier.

Linvill, D. L. (2013). The Bias Fallacy. American Association of University Professors.

Linvill, D. L., & Mazer, J. P. (2011). Perceived ideological bias in the college classroom and the role of student reflective thinking: A proposed model. Journal of the Scholarship of Teaching and Learning.

Linvill, D. L., & Mazer, J. P. (2013). The Role of Student Aggressive Communication Traits in the Perception of Instructor Ideological Bias in the Classroom. Communication Education, 62(1), 48–60.


Representative Art vs. The Real Thing: Which is more beautiful?

“…we think that the world would be improved if we could substitute for the best works of representative art real objects equally beautiful.”

G.E. Moore, Principia Ethica (§117, ¶2)


I don’t buy it. 

Consider the statue of David. Now ask yourself, “Would this be more or less beautiful if it were an actual man standing on the pedestal?”

Continue reading Representative Art vs. The Real Thing: Which is more beautiful?

A Definition of ‘Fake News’ (and Related Terms)

If the public discourse in the United States is any indication, then people in the US mean different things by ‘fake news’. Naturally, then, it is time to agree on a definition of ‘fake news’. While we’re at it, let’s distinguish ‘fake news’ from other terms.

1.  Let’s Agree On Terms

As I see it, we will need to distinguish between at least three terms: fake news, conspiracy theory, and journalism.

A Definition of ‘Fake News’

Also known as “fictional news”. Characterized by outlandish stories — sometimes about paranormal and supernatural events. Any explicit claims to truth are obviously belied by the semi-serious, comedic tone. Examples include many of the cover stories of the Weekly World News as well as some of the satirical punchlines of The Daily Show.

A Definition of ‘Conspiracy Theory’

Bad explanations designed to glorify their author and undermine the author’s perceived nemeses. Sometimes unfalsifiable. Alas, believed by many people. Examples are voluminous; they include certain explanations of the assassination of John F. Kennedy and InfoWars host Alex Jones’s claim that the Sandy Hook shootings were staged.

A Definition of ‘Journalism’

Continue reading A Definition of ‘Fake News’ (and Related Terms)

Research Questions & Mental Shortcuts: A Warning

Daniel Kahneman talks extensively about how we make reasoning errors because we tend to use mental shortcuts. One mental shortcut is ‘substitution’. Substitution is what we do when we (often unconsciously) answer an easier question than the one being asked. I find that I sometimes do this in my own research. For instance, when I set out to answer the question, “How can X be rational?” I sometimes end up answering easier questions like, “How does X work?”. In an effort to avoid such mistakes, I will (1) explain the question substitution error, (2) give an example of how we can distinguish between questions, (3) give a personal example of the substitution error, and (4) say what we can do about it.

1.  Substitution

In case you’re not familiar with Kahneman’s notion of ‘substitution’, here is some clarification. In short, substitution is this: responding to a difficult question by (often unintentionally) answering a different, easier question. People use this mental shortcut all the time. Here are some everyday instances:

Difficult question: How satisfied are you with your life?
Easier question: What is my mood right now?

Difficult question: Should I believe what my parents believe?
Easier question: Can I believe what my parents believe?

Difficult question: What are the merits/demerits of that woman who is running for president?
Easier question: What do I remember people in my community saying about that woman?

For further discussion of mental shortcuts and substitution, see Part 1 of Kahneman’s Thinking, Fast and Slow (2012).

Now, how does this mental shortcut apply to research?  Continue reading Research Questions & Mental Shortcuts: A Warning