The Power of Critical Thinking: Bias and Self-Deception
Some time ago, during one of those informal post-conference discussions, a fellow gave the best description of bias I've ever encountered. I will attempt to paraphrase:
Bias is the natural result of every person's self-centrism. We all - except for the few who are incapacitated by self-doubt - are utterly convinced that our beliefs, actions and opinions are absolutely correct at the time.
The caveat "at the time" is an essential part, because all of us (except for those who are pathologically incapable of acknowledging error) have had occasion to realize that we have made a mistake, that our beliefs, actions and/or opinions were wrong.
However, at the moment we thought that, we were convinced we were correct - correct about our having previously been in error.
You can see how this could rapidly get out of hand.
But it holds together very well as a general statement of how people view their own thinking process. We (with very few exceptions) are sure that we are doing or thinking the "right" thing at the time that we are doing it or thinking it.
So, what has this got to do with bias? Plenty.
Bias is a sort of "blind spot" in a person's thinking - a place where their assurance of being right makes them vulnerable to imagining the world to be different from how it truly is. It is, in short, a minor delusional state.
As Mark Twain is reported to have said: "It ain't what you don't know that gets you into trouble. It's what you know for sure that just ain't so." So it is that the blind spot of bias gets people into trouble. Ignorance is merely a need for information or education; error - especially when it is believed wholeheartedly - is a calamity waiting to happen.
A good analogy would be a map. If your map of reality has blank spots, you are likely to be more cautious in those areas. You'll ask questions, listen to what people who have been there have to say and keep your eyes open. On the other hand, if your map has roads where there are none and smooth plains where there are cliffs and pits, then you are standing into trouble.
Bias can lead to self-deception (more about that later) when it convinces a person to ignore good information because it conflicts with their pre-formed (and self-corroborated) view of reality. Some spectacular examples of this have occurred in the media in recent years, when otherwise sage and seasoned journalists were completely taken in by fakes and phonies. That they often did so in the face of evidence that they were being hoodwinked only serves to underscore the pernicious nature of bias.
As an aside, it is laughable in the extreme to see people state that they are "not biased". Ridiculous! Everybody is biased in some way - everybody has an opinion on everything, even if that opinion is "That's too unimportant to think about." As a result, everybody is prone to - biased toward - a particular view of reality.
To reiterate: no matter what you think, you think that's the right thing - at that time. If it were a conscious decision, it wouldn't be bias. Bias influences, shades and slants conscious decisions, in ways that we are not aware of at the time. Looking back, we may be aware of how our assurance of correctness led us to disaster, but we don't - we can't - see it at the time.
To be sure, people are often aware that they are doing or thinking things that "aren't right", but that is the imposition of an external measure of correctness. It has nothing to do with whether the person feels that their thoughts or actions are right for them, in their situation, at that time. The person who is shoplifting a loaf of bread knows that it is a crime (possibly even a sin), but is assured that they are doing the right thing for their own reasons.
Bias is usually a necessary precondition for self-deception. It would take an incredible strength of will to truly believe something you know to be false. I'm not sure that it can be done this side of insanity.
However, humans have shown themselves to be masters of convincing themselves that the real is false and the false is real. And the first step in this process is to be assured of the correctness of one's own thinking - our old friend bias!
In some ways, it's rather amazing that people can ever avoid self-deception. What keeps most people sufficiently grounded in reality to allow them to carry out the day-to-day activities of living is a regular contact with aspects of reality that cannot be easily ignored.
Someone who has convinced themselves of their ability to fly (sans aeroplane) will rapidly (at 9.8 m/sec/sec) be disabused of this notion. They may even survive to "internalize" the lesson. Likewise, reality intrudes its unwanted self into most self-deceptions, sometimes sooner, sometimes later; sometimes subtly, sometimes dramatically.
But there are some self-deceptions that are resistant to reality therapy. Some of them lack a sufficient grounding in reality to ever run across a contradiction. Most religious thought is of this nature. There is simply not enough contact between religion and reality - with some notable exceptions - to provide a convincing "whack" to the deceived... er, the devout.
Other times, the collision with reality is a long time coming, allowing people a "grace period" to believe (and perhaps entrench) their self-deception. For example, people who try quack remedies for real illnesses often have a period of time before the reality of their unchecked disease breaks through their belief. Even worse off are the people who try quack remedies for imaginary illnesses - they will now have two mutually-reinforced self-deceptions!
Additionally, there are some counter-reality thoughts that can diminish the impact of reality. Some of these are (not an exhaustive list by any means):
- It would be so embarrassing if I were wrong!
- I've invested too much time/money/effort/reputation on this to admit that I was wrong!
- People would be so mad at me if I were wrong!
- I really trust the person who told me this...they can't be wrong!
- I really need for this to be right!
You'll notice that I haven't included anything like "It's all a conspiracy!" or "They're lying to me!" That's because these sorts of thoughts are the result of self-deception (and might even be considered diagnostic of it) rather than contributing factors. Once a person has convinced themselves that the cost of being wrong is (for them, at this time) greater than the cost of persisting in error, then the mind will generate a suitable set of excuses (sometimes called "rationalizations" or even "delusions") in order to maintain their denial of reality.
Rationalizations are a way of covering up the gap between reality and a person's conception of reality. The more their mind-generated world-view comes in conflict with reality, the more rationalizations are needed. At some point, they may even cross the imaginary line between denial and delusion.
People who are in denial are often very hostile toward people who try to bring them back to reality. After all, it's hard work to maintain all those rationalizations and they do not appreciate visitors who want to track reality all over their snug, safe sanctuary. They will often lash out at people who have the temerity to disagree with them.
As a result, people in denial often seek out others with the same - or similar - world-view. The Internet has aided this process immensely, providing innumerable places for people to gather to share, refine and reinforce their denial of reality.
Avoiding Self-Deception:
While it would be nice if we could all look reality in the face at all times, this simply is not the human condition. Failing a complete refit of the human psyche, what can we do to avoid self-deception?
- Check your ideas with someone else.
This is the basic concept behind "peer review" in scientific journals. Your ideas, methods, data and conclusions are put before a number of independent (they aren't all sitting in one room, influencing each other) reviewers who read them and give their critique (and usually criticisms). This provides a number of minds that - in all probability - do not have the same biases (the same blind spots) as the author(s).
For the average citizen, "peer review" can be a bit harder to find. A common mistake is to ask someone who already thinks as you do to critique your idea. Thus we have UFO conspiracists asking other UFO conspiracists if the small sharp thing in their bum is a splinter or an alien mind-control device, with the predictably bizarre answer.
This is also where a lot of the news media have fallen foul of self-deception. In a group with the same political and social mindset, it may be hard to find someone to say, "Gee, Dan, that memo doesn't sound very believable to me. Maybe you ought to check it out better." It's also unreasonable to expect people to tell the boss that they think he's gone crackers. Better to find an independent appraisal.
- Be skeptical of everything you hear, especially if you agree with it.
Again, the news media would have saved themselves a number of black eyes and bruised egos if they had followed this simple rule. For that matter, a number of people in science would have been better off if they, too, had heeded this advice.
You are very much more likely to be taken in by a falsehood that conforms to your world-view (your biases) than to believe a truth that doesn't. Keep that always in your mind. Question the basis for any statement that claims to be "fact"; question it twice if you find yourself wanting to believe it.
- Occasionally step back and ask yourself, "What would it take to make me believe/not believe this?"
When we are self-deceived, the answer to this simple question - if we are honest with ourselves - is most often "There is nothing that would change my mind." This is a very bad answer because it means that your belief (or non-belief) in something is a matter of religious devotion, not reason. If that's OK for you, so be it. But it might just prevent a nasty misstep if you can recognize that you are not seeing the world as it really is.
This has just scratched the surface of a topic that could fill volumes (and has!). Perhaps I'll add another volume to the world's collection someday. It's on my list (right after cleaning out the attic).
Until next time!