The scientists gave the students four measures of “cognitive sophistication.” As they report in the paper, all four of the measures showed positive correlations, “indicating that more cognitively sophisticated participants showed larger bias blind spots.” This trend held for many of the specific biases, indicating that smarter people (at least as measured by S.A.T. scores) and those more likely to engage in deliberation were slightly more vulnerable to common mental mistakes.

Why does this happen?
One provocative hypothesis is that the bias blind spot arises because of a mismatch between how we evaluate others and how we evaluate ourselves. When considering the irrational choices of a stranger, for instance, we are forced to rely on behavioral information; we see their biases from the outside, which allows us to glimpse their systematic thinking errors. However, when assessing our own bad choices, we tend to engage in elaborate introspection. We scrutinize our motivations and search for relevant reasons; we lament our mistakes to therapists and ruminate on the beliefs that led us astray.

The problem with this introspective approach is that the driving forces behind biases—the root causes of our irrationality—are largely unconscious, which means they remain invisible to self-analysis and impermeable to intelligence. In fact, introspection can actually compound the error, blinding us to those primal processes responsible for many of our everyday failings. We spin eloquent stories, but these stories miss the point. The more we attempt to know ourselves, the less we actually understand.

Eh, I don't know. My guess (and this comes from someone who considers himself to be smart) is that there is a certain amount of pride that gets in the way. When I come across a math problem, I'm often quick to go for the solution, even if it means that I skip or ignore various steps. I want to show off how smarty-pants I am and get there first. I doubt I'm alone in this.

Philosophical and ethical questions are a bit different. They are still prone to bias errors, but they depend more on value judgments. The article doesn't go into this, but there is at least one good way to get past those biases: work with a group. The more perspectives you bring, the more likely you are to see different angles. (That's one of the reasons I'm glad that I've found other people to talk with about the Great Books!)

The moral? Be careful out there, I guess.
Look for articles that you disagree with. Find smart people who will oppose you. And be humble and willing to admit error.