Or are we?
For my medical friends, we see it in the American patient predilection for paging Dr. Google. For those of us who follow the environment, we see it in the denial of climate change. And we ALL saw it in the 2016 election in the form of the echo chambers on both ends of the political spectrum.
What is “it”?
Confirmation bias: that big, bad cognitive stumbling block that allows us to completely ignore information that doesn't support our ideas, opinions, or "facts." Our brains are amazingly gifted at dispensing with inconvenient information, and neuroscience shows that we tend not to seek out information that challenges our beliefs.
Think about that for a moment. When is the last time that you actively sought information that doesn’t align with something that you believe about the world? Be honest here. Oh, and think about how much you learned the last time that you did make that effort (because we know that confirmation bias limits our learning!).
It seems that the current political and social environment in the US has resulted in a flurry of writings about confirmation bias, particularly its impact on science and policy. While confirmation bias used to be a phrase that we mostly used to describe a failure to maintain equipoise as an investigator, it’s become a key part of the 2017 lexicon, particularly with the advent of terms like “alternative facts.” I have to admit that I’m not sure I feel any safer about people embracing alternative facts when we know that facts don’t change our minds.
Another often underestimated driver of confirmation bias comes from deeply held personal values. Be it the risks vs. benefits of drilling for oil in the Arctic or how Planned Parenthood actually spends taxpayer dollars, many individuals have values-driven opinions that impair their ability to have meaningful dialogue around these topics (myself included at times). Too many of us fancy ourselves experts on topics where we aren't, and instead of intellectual, meaningful dialogue we get meaningless, unhelpful shouting matches. Suddenly we're back to that echo chamber of the 2016 election…
The medical tie-in for confirmation bias is, of course, when it impacts our diagnosis and management of patients. (Note: if you haven’t read Jerome Groopman’s How Doctors Think and you are in the medical field, you are doing yourself and your patients a disservice.) Certainly we base a great deal of what we do upon pattern recognition. But what about those times when the patterns lead us down a primrose path that is…wrong? It happens, even to fabulous clinicians.
How do we overcome confirmation bias if it is so insidious?
Simple. Seek proof that what you're thinking is a terrible idea. Look for disconfirming data. Conduct small experiments that can disprove your idea just as readily as prove it.
In other words, prepare to be wrong from time to time. It’s part of our human experience.