The Sin of Certainty

Screen capture from WikiLeaks video, Iraq

Lane Wallace wrote a recent article in The Atlantic about bias in journalism and how it influences the questions one asks. As journalists are human beings, I think we can reasonably assume that they, like everyone else, have biases. What resonated with me was Wallace’s use of the phrase ‘the sin of certainty,’ coined by the science writer Jonah Lehrer. I think this phrase has great utility and should humble us when we consider how much we don’t know. As an example, physicists have just observed for the first time a new, yet-to-be-named element (#117). And it took until the late 1990s for physicists to recognize that perhaps up to 95% of the universe is actually composed of dark matter and dark energy. These are far from trivial matters, yet they eluded us until now.

It stands to reason that if everything that we previously believed constituted the entire universe actually only makes up 5% of it, then we likely have some other very serious gaps in our knowledge and probably always will. As Richard Feynman once said, “I think Nature’s imagination is so much greater than man’s that she’s never going to let us relax.” That’s OK. Most people like a good mystery. However, we are often hesitant to think in such terms, that there is much to be known, and that there is a chance (probably a very good chance) that we might be wrong about at least some of our most cherished beliefs. Instead, we have a tendency to seek information that reinforces our previously existing beliefs (e.g., ‘confirmation bias’ or ‘groupthink’). Nick Kristof described the tendency of people to read online media that corresponds with their pre-existing political biases as ‘the daily me’:

there’s pretty good evidence that we generally don’t truly want good information — but rather information that confirms our prejudices. We may believe intellectually in the clash of opinions, but in practice we like to embed ourselves in the reassuring womb of an echo chamber.

The tendency to believe what we want to is not confined to politics; it extends to other spheres of knowledge as well. The recently leaked story about the American helicopter pilots who fired on a group of Iraqi non-combatants, including two Reuters journalists, illustrates the human mind’s limited capacity to process information. Believing the individuals were carrying weapons such as rocket-propelled grenades, the pilots convinced themselves that the group was made up of insurgents and was therefore an acceptable target. Within the fog of war, such mistakes have life-and-death consequences, and can even shape whether wars are fought in the first place or further escalated, as in the case of the Gulf of Tonkin incident during the Vietnam War.

In describing the many ways that our brains are unable to overcome visual and cognitive illusions, the behavioral economist Dan Ariely refers to human decisions as ‘predictably irrational’ (note: everyone should watch this video – it’s brilliant). That is to say, much of our decision making is contingent upon circumstances, whether through culture, the way a question is asked of us, or even what choices are available. His examples are compelling, and I won’t rehash them here, though I cannot help but draw a parallel to a study done on a group of captive capuchin monkeys (Brosnan and de Waal, 2003). Researchers trained the capuchins to exchange a token for a reward, usually a cucumber. However, some monkeys were intentionally given a grape instead (a more desired food), sometimes for free (without exchanging a token). Other monkeys who witnessed these unfair rewards for their peers and were then offered the standard cucumber refused to participate further, wouldn’t eat the cucumbers, or threw them at the researchers.

The upshot is that monkeys seem to have a sense of fairness, but this was contingent upon what other monkeys got. After all, at one point a cucumber was a perfectly fine reward, but why remain content with that when better rewards are available and have been given previously to your peers? Context matters, even for capuchins. We think in similar terms. At one point, our salaries or hourly wages may be acceptable, until we learn that so-and-so is making more for doing the same job.

It seems unfortunate, but almost unavoidable, that we are predisposed to think that the beliefs we hold are inherently superior to those of others (of course we do – otherwise we wouldn’t hold them in the first place, having judged no better option available). In the realm of exchanging ideas, it is necessary to retain a healthy mix of confidence and humility in what we know. I don’t think it’s very helpful to take this too far and adopt a nihilistic stance – that we can never know anything. Otherwise, what’s the point? Besides, there’s far too much evidence to the contrary; we can make informed observations and predictions about the way nature works. However, every once in a while it would seem like a good idea to pause and reflect on how flawed – though nonetheless wondrous – our evolved brains are. It should also be acceptable to simply say “I don’t know” rather than obstinately hold fast to an ill-informed opinion (political pundits could really use a lesson in this).

I might be biased (obviously), but I believe that this is where a scientific approach shines. At a fundamental level, what science has going for it is the recognition that all conclusions are tentative and contingent upon future discoveries that could potentially overturn what we know today. We need to be prepared to change our minds should new evidence arise. That’s often easier said than done, but the principle still stands. When the economist John Maynard Keynes was once criticized for being inconsistent, he replied, “When the facts change, I change my mind. What do you do, sir?” Stubbornness and blind adherence to one’s cherished beliefs are nothing to be admired when they are unwarranted.


Brosnan SF, de Waal F (2003) Monkeys reject unequal pay. Nature 425: 297–299.

3 thoughts on “The Sin of Certainty”

  1. This is a great blog post!

    I love the thought put into this as well as all the connections you made.

    There are 3 types of listening according to the Art of Living workshop.
    1. Intellectual. This is the part of us that is constantly saying “yes” or “no”, agreeing or disagreeing.
    2. Emotional. This is when someone like a motivational speaker talks and you get all amped up, and you leave and go to tell everyone how great it was, and when they ask what it was about, you can’t quite remember any of the details.
    3. A combination of the two. This way you get the best of both worlds and knowledge can sink in.

    If you listen only with your intellect, as soon as you think you know something you stop listening and maybe even start trying to convince others around you of your rightness. If you listen only with your heart then you don’t really retain anything and the goal of knowledge (to make us act) becomes useless. When you combine the two you hear with openness and allow yourself to listen with passion. Then and only then can we take that information and do something that could change our lives with it!

    Thanks for sharing and letting me share too!
