Alan Jacobs set up a splendid newsletter that delivers tiny presents to my inbox each week (you can sign up for it here). His latest email included a link to a piece by John Timmer called “Political radicals don’t evaluate their own errors—about anything.”
In the article, Timmer outlines an experiment in which participants were asked to guess the number of dots shown in a specific area, and then to rate how confident they were in their answers.
You can learn more about the experiment here, but the researchers found that people who held more radical beliefs (“all participants were given a set of questions about their beliefs on a variety of topics, designed to get at things like their dogmatism, intolerance, and authoritarian tendencies”) were less likely to approach their opinions with humility. Most of them also declined to change their answers even when presented with new information that would have helped them make a more accurate guess. Timmer writes:
The authors sum up their work by writing that “more radical participants displayed less insight into the correctness of their choices and reduced updating of their confidence when presented with post-decision evidence.” That’s a bit surprising, given that most research into this area has focused on the ideas themselves and suggested that confidence is simply a mechanism for protecting those ideas.
This is not an easy topic to dissect, and it likely requires more chiseling to get to the root of the problem (a black-and-white analysis of cause and effect would probably hurt more than help). But it does highlight the need for more humility in thought and decision-making. Recently, I heard Daniel Kahneman on the Conversations with Tyler podcast talk about the need for “delayed intuition.” Meaning, it’s often healthy to hold off on an instinctive, reflexive decision until we’ve heard more facts. In other words, let some time pass before you trust your gut. Obviously this doesn’t apply to every situation (athletes and chess players must learn to act quickly on their intuitions), but it does to most.
Here’s the takeaway for me (and maybe for some of you): I hold strong beliefs in certain areas. I’d like to think that I’m right (I’ve also done a great deal of research that leads me to believe I’m right). Even if I am, I must recognize that certainty in one realm may point to a tendency to jump to conclusions, and to prideful stubbornness, in another. I need to give attention to my intuitions—which I hope I’m developing over time—but also practice delayed intuition whenever possible.