2: Open to Options
The author describes an experiment (Kahneman) in which people were asked to indicate the last four digits of their telephone number before estimating the number of doctors in New York City. There was a high degree of correlation between the two: people whose numbers ended in 7000-9999 guessed, on average, 75% higher than those whose numbers ended in 0000-2000. The two have nothing to do with one another, but there's a clear bias.
Anchoring and Decision-Making
The tendency for one experience to bias the one that follows it relates to the psychological concept of "anchoring," in which a person's response is biased by carry-over from the recent past.
Laypeople recognize anchoring when they delay asking someone for a favor because that person is in a bad mood: regardless of whether the request is reasonable, a person who is anchored to a negative state of mind is less likely to consent to something they would typically have accepted (and a person in a good mood is more likely to consent to something they might otherwise have refused).
Anchoring can also have more severe consequences - consider that a doctor who has recently read an article or heard a sales pitch for a given treatment is more likely to diagnose patients as needing that treatment, which may be a misdiagnosis.
Whether the outcome is better or worse depends on the situation, but in any case anchoring detracts from rational decision-making.
A few additional points are made:
- People tend to approach reasoning from an established set of premises and consider only possibilities compatible with them - and don't tend to ask whether the premises themselves may be wrong.
- How people perceive a problem tends to shape how they reason about it - that is, they assume one possible solution and consider how to achieve it, without considering alternative solutions.
- The mental model is based on an incomplete representation of reality, and we often trade accuracy for speed.
All of this results from the human mind's desire to get to an answer quickly. That is, we are inclined to use an efficient problem-solving process even if it results in a poor solution.
In most instances, where we are solving a routine problem, the process does in fact yield a good solution. But when we encounter an unusual problem or unusual circumstances, the process fails to recognize them, and the outcome and its consequences may not be good. So when the stakes are sufficiently high, we must pause to question the process.
Content with the Plausible
Another theory suggests that we are inclined to stop thinking once we reach a proposition we find to be plausible or acceptable.
For this point, the experiment is a simple question: what is the freezing point of vodka? The natural anchor for this question is the freezing point of water, and people reckon that alcohol has a lower freezing point. Asked for a single figure, answers average 12 degrees Fahrenheit; asked for a range, answers span -7 to 23 degrees. In fact, the freezing point of 80-proof vodka (40% alcohol, 60% water) is -17 degrees.
A second experiment was done with real estate agents, who were shown identical information about a specific house - except that the listing price was higher in one version than in the other. When asked, fewer than 20% of agents admitted that the listing price influenced their appraisal at all; the rest insisted their assessments were completely independent. However, the statistics told a different story: those who saw the higher listing price came back with substantially higher appraisal values.
(EN: Both examples support the existence of anchoring, but neither seems to speak to the notion that the mind stops thinking when we find a plausible answer)
Anchoring is a valuable tactic in business negotiations involving amounts: the party that makes the first offer can anchor the negotiation against that amount (Galinsky).
Judging Books by Their Covers
Another significant decision mistake is to rush to a conclusion based on superficial information, accepting or rejecting things we ought to consider more carefully.
The example given is a forest ranger who went to an emergency room because he was having chest pains. The physician immediately dismissed a heart condition due to the man's youth and good physical condition, and sent the patient on his way with the wrong diagnosis. The patient had a heart attack the following day.
In another example, a woman visits a doctor with a low-grade fever and high respiratory rate, during a time when there was an outbreak of viral pneumonia. While her symptoms didn't quite match the typical presentation, the doctor assumed she was having a "subclinical" case of the same pneumonia, in which the typical symptoms had yet to surface. It turned out she had a regular cold and had overdosed on aspirin.
Even if we do not completely ignore or dismiss a possibility, we tend to dismiss it quickly if we believe it to be improbable. Said another way, we jump to the obvious conclusion rather than considering the full breadth of possibilities.
This is an inherent problem in heuristic diagnosis, which reduces complex situations to a simple decision-tree approach of yes/no questions. Not only does this ignore unusual situations, but it causes us to avoid asking questions that would lead to a more accurate diagnosis.
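To make the point concrete, here is a minimal sketch of such a yes/no decision tree - the symptoms, rules, and the heuristic_diagnosis function are hypothetical illustrations, not taken from the book. The tree stops at the first plausible answer and never asks the follow-up questions that would catch the unusual case.

```python
# Hypothetical sketch: a yes/no heuristic reduces diagnosis to a fixed decision tree.
# Each branch asks one question; the first plausible match ends the inquiry.

def heuristic_diagnosis(patient: dict) -> str:
    if patient.get("chest_pain"):
        if patient.get("young_and_fit"):
            return "muscle strain"      # dismisses a cardiac cause on superficial grounds
        return "possible cardiac event"
    if patient.get("fever") and patient.get("rapid_breathing"):
        return "viral pneumonia"        # the current outbreak anchors the answer
    return "no diagnosis"

# An unusual case: a fit patient with chest pain is waved off, because the tree
# never asks about family history, symptom duration, or test results.
ranger = {"chest_pain": True, "young_and_fit": True}
print(heuristic_diagnosis(ranger))  # -> "muscle strain"
```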
Imagining Patterns and Trends
Another common mistake people make is to extrapolate inappropriately from past results - that is, to assume that patterns will continue to repeat and trends will be followed. Neuroscience experiments (Huettel) suggest that electrical patterns in the brain begin to anticipate repetition: a person who sees the same shape twice in sequence expects to see it a third time.
There is some suggestion that we are academically trained to look for trends in data, but the tendency is likely much more deeply ingrained: given that human beings have no instincts, we must be efficient at making associations between sensory perceptions and dangers or opportunities in the environment - some instances are so traumatic that one occurrence leads us to expect a repetition. We generally acknowledge that the past does not predict the future, but we often act as if it did.
Compounding the problem, the human brain is so eager to find patterns and connections that it often does so in irrational ways. The treatment approach of foundational figures in psychology, Freud and Jung, focuses on discovering and rewiring these irrational associations - talking to the patient to uncover the source of crossed wires so that they can be dismissed by rational analysis.
Rationalization, Confirmation Bias, Selective Attention, and Stress
Festinger's theory of cognitive dissonance holds that when a mind encounters two contradictory thoughts (ideas, attitudes, beliefs, opinions, etc.) it cannot accept that both may be right and seeks to accept one and dismiss the other. Rationalization is the process of making such things fit.
For example, a man who recognizes that wearing a seat belt improves safety but doesn't want to do it will convince himself that his driving ability is good enough to keep him safe, and that he is therefore right to avoid making a habit of buckling up. It's clear to anyone else that he is deluding himself, but he will cling to such a rationalization. We accept that a rationalization is harmless when it harms no one else, but the author suggests that "a scan of history" will reveal the harm done by people who justified their own behavior as they ignored the consequences.
The author mentions the case of religious scientists, who are strongly conflicted between the scientific evidence they find and the religious beliefs they hold - hence the vehemence of the debate between evolutionism and creationism, and the way that the silliness of creationists who attempt to reconcile the two is glaring to everyone but themselves. She goes on to discuss a few other religious examples, including suicide cults, which effectively require adherents to accept a fiction and turn off their conscious, rational mind.
Confirmation bias is a similar phenomenon, though it tends to occur after a decision has been made and the individual ignores any information that is inconsistent with the conclusion. (EN: an alternate theory is that a person does not pointedly ignore this information, they merely do not notice it because they are "done" thinking about it and are no longer collecting information.)
A reference to media studies: when radio came out, there was a great deal of concern that the "vulnerable public" would be subjected to a flood of new information and that civil unrest would occur if masses of people received the same subversive message all at once. Sociologists of the time (Katz and Lazarsfeld) refuted this view, suggesting that people would not be automatically receptive to new ideas; most would simply ignore anything that didn't jibe with their existing beliefs - which is what occurred.
Another experiment (Westen) was conducted among political partisans, staunch Democrats or Republicans, who were shown slides containing inconsistent comments from Presidential candidates. Each was attuned to the inconsistencies in the opposing party's candidate, but not to their own. Observation of electrical activity in the brain showed that they did not employ conscious reasoning when they saw information they disagreed with, but when they saw what they liked, their brains had a positive emotional reaction - in effect, their brains were rewarding them for recognizing what they already believed.
The author also mentions stress. In small amounts, every once in a while, stress can help to focus the mind intently on the task at hand, which is extremely valuable for making split-second decisions in survival situations. But aside from the physical and mental effects of high or long-term stress, stress is extremely counterproductive to making sound decisions: it leads people to choose options that are immediately effective in decreasing the stress, not those that are effective in solving the problem.
Pursuing Incentives
In some situations, a person is expected to make decisions for the sake of another person or group, but our natural inclination is to consider the consequences of an action to ourselves. Even when we are wary of it, this conflict of interest will compromise a person's ability to properly consider alternatives.
(EN: I've seen the argument swing the other way, suggesting that current trends in culture are designed to swing a person's judgment too far in the opposite direction, such that they end up with a kind of martyr complex, choosing the option that is most detrimental to themselves in order to be praised and rewarded for having done so - so arguably it's self-serving in a different way. In either case, the problem remains that they are not properly focused on assessing the benefit to another party.)
An anecdote is related in which a presenter asked a group of surgeons which of two options they would use on a patient, then asked them which they would recommend to a family member. The surgeons chose a different option based on the question - they preferred to perform the newer procedure, even though they clearly felt the older procedure was safer for the patient.
The author also uses the recent problems with subprime lending as an example of self-interest: people with poor credit scores could get nice homes, lenders could earn fees and collect higher interest payments, securitizing the mortgage was a profitable business, etc. Virtually every player in the crisis was motivated by their immediate self-interests, and none considered the consequences that would, they assumed, fall upon others when things went sour.
Another experiment was done among chartered accountants, who were asked to review a number of accounting scenarios and assess whether they were in compliance with proper accounting practices. The half of the group that was told they were reviewing their own company's books found 30% more of the cases acceptable than the half that was told the books belonged to a different firm. Since the scenario was hypothetical, the researcher reasoned there was no incentive to skew the results in favor of their own firm, yet many did all the same.
Avoiding Tunnel Vision
The author provides five tips:
- Explicitly consider alternatives. Even when it seems an obvious choice, or especially when it does, pause to ask what other options might be considered.
- Solicit dissent. Ask questions that could elicit contradictions to your assumptions - and instead of ignoring contradictory information, be attuned to it.
- Review previous decisions. People tend to fall into patterns, and keeping track of past decisions helps us to become aware of them so we can be more deliberate in our future decisions.
- Avoid making decisions while at emotional extremes. Neuroscience upholds the aphorism that people are subject to their moods, and emotion clouds reason.
- Recognize incentives. When making decisions that are for the benefit of others, consider what benefits you may gain from the outcome, even if self-service is not your intent, as well as benefits or detriments to people outside the group you intend to serve.