20 - Cognitive Biases
Cognitive biases are psychological tendencies that distort perception, lead to misinterpretation or illogical inference, reduce the accuracy of judgment, and generally introduce an element of irrationality into critical thinking.
Even highly intelligent people are affected by cognitive biases, and they can be difficult to detect and avoid. Being aware of them may be of some value, though awareness of a bias does not guarantee we will recognize or acknowledge when we are being biased. It is generally easier to witness bias in someone else's argument.
Memory Biases
Memory bias leads us to place emphasis on things that have happened more recently, or which made a more vivid impression on our memories, even though these are not necessarily the most frequent or probable events.
For example, people tend to overestimate their chances of being attacked by sharks while swimming and underestimate their chances of drowning in a bathtub. This has more to do with fear or drama than rational thinking. (EN: Checked up on this. In 2012, 317 people drowned in tubs and 47 were attacked by sharks, with only two fatalities.)
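The figures in the editor's note can be put in proportion with a trivial calculation (a sketch only; the variable names are illustrative):

```python
# Figures cited in the note above for 2012: deaths by bathtub drowning
# vastly outnumber deaths from shark attacks, despite the perceived
# risk running the other way.
bathtub_drownings = 317   # people who drowned in tubs
shark_attacks = 47        # shark attacks that year
shark_fatalities = 2      # fatal shark attacks

# Bathtub drownings per fatal shark attack
ratio = bathtub_drownings / shark_fatalities
print(f"Bathtub drownings per fatal shark attack: {ratio:.1f}")  # 158.5
```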
People also give much more weight to anything with which they have had a first-hand experience, even when it is contrary to statistics. Anecdotes and hearsay from friends are also taken as more credible than research findings.
Our optimism/pessimism about an outcome leads us to think that our expectations will be fulfilled and to be inattentive to or dismissive of evidence to the contrary.
Context Bias
A context bias occurs when irrelevant features of a situation in which a judgment is made bias the outcome.
One example of context bias is that naming a large number in the context of a question will cause estimates to be larger. Asking "How many Americans ..." will generate lower estimates than asking, "Out of 300 million Americans, how many ..." because the latter sets the context of larger numbers.
It need not be in the context of the question - the Ariely experiment at MIT merely asked participants in an auction experiment to provide the last two digits of their own social security number. Those whose numbers were higher routinely offered higher bids, sometimes by as much as 100%, than those with lower numbers.
Another example was an experiment by Northcraft and Neale (1987), which asked real estate agents to appraise a home based on information that included its listing price, which was set lower or higher for the same property. Naturally, those who saw higher list prices gave higher estimates, though they steadfastly denied having considered the list price in their assessment. Another researcher (Galinsky 2004) observed of business negotiations that the first party to name a price skewed the negotiation in their favor, contrary to the traditional advice to let the other party make the first move.
Another common context bias is the framing effect, which demonstrates that the way a question is phrased can influence the responses. For example, people are more inclined to act to avoid pain than to acquire pleasure, so the way that a question is phrased will influence their decision. If a patient who needs an operation is told that it has a 90% survival rate, they are more likely to agree to the procedure than if they are told it comes with a 10% chance of death. The odds are exactly the same, but the way the issue is framed differs.
Another study (Hossain and List 2009) demonstrates that workers who are told that they will receive a bonus will work harder to avoid having it taken away than those who are told they might receive a bonus if they meet certain production goals. (EN: This may speak more to their assessment of the credibility of their firms.)
Evidential Failures
A number of biases arise from failure to use information and evidence correctly.
The confirmation bias, which is very persistent and well documented, suggests a tendency to interpret the world to fit our existing beliefs, ignoring or neglecting counterevidence. This can sometimes be entirely deliberate, among people who wish to "win" an argument by ignoring evidence to the contrary, but it also affects more sober investigations in which a person is generally more interested in supporting evidence and does not spend adequate time seeking out opposing viewpoints.
Confirmation bias is strongly evident in superstitious beliefs: if a person believes that Friday the thirteenth is an unlucky day, he will give more attention to unfortunate events than fortunate ones that happen on that day.
There is also the "belief perseverance effect," which is fairly similar: when we choose to believe something, we will often keep believing it and ignore or dismiss any contrary evidence. The author refers to a "famous study" (for which he cannot name the source) in which students arguing over capital punishment were provided with the same set of data, yet each side found the data supporting their own proposition more convincing than disconfirming data, and even stated a higher level of confidence after seeing information that contradicted their positions.
Ego Biases
Ego biases stem from our self-perception and our desire to see ourselves as better than others, particularly in terms of our ability to evaluate evidence and draw conclusions.
Rationalization is a common example, in which we find excuses to justify our actions or grant ourselves greater self-esteem. A person who does something unethical will stretch to find reasons their behavior was ethical. An investor who makes money in a rising market feels that it's due to his own financial savvy - and when he loses money, it's the market that's to blame.
There is also the matter of overconfidence, which leads us to make hasty decisions without due consideration of the circumstances or the evidence. The author refers specifically to the "above average" effect, which often evidences itself:
- More than half of drivers consider themselves better and safer than average
- Business managers usually regard themselves as more capable than typical managers
- Most students consider themselves to be more popular than average
- More than half of students taking a test think their results will be in the top half of the group
The above-average effect is particularly prominent when we think that something is easy to do. Conversely, if something seems difficult, people tend to underestimate their ability instead of overestimate it.
A closely related effect is the optimism bias - the tendency for people to expect their plans to work out. This includes students who feel certain they will pass an exam or get a job offer. It is also seen in the variance between the estimated and actual cost and time of projects: the larger the project, the greater the deviation, because the predictions were optimistic rather than realistic.
There is also the power bias, in which people in positions of authority (managers, politicians, teachers, clergy, etc.) are much harsher in judging others than themselves - this occurs both before the fact (believing they will do better than they will) as well as afterward (rationalizing poor performance or redefining success).
Combating Cognitive Biases
Cognitive biases are often seen as inevitable - by definition, we are unaware of them, so it seems impossible to account for them. However, we can strive to improve our awareness of them. Some suggestions:
- Be aware of what cognitive biases are and how they arise
- Be more systematic in your approach to gathering, analyzing, and interpreting information
- Think about how other people might respond
- Consider different perspectives, or formulate the question in a different way
- Actively seek out contrary evidence and unpopular alternatives
- Talk to people who disagree with you
- Plan sufficient time to think through a problem and avoid hasty decisions
- Gather user feedback and experience
- Consider the reasons why you believe what you believe
- Leverage statistical analysis where possible