Introduction: Smart Is as Smart Does
The author opens with an ironic incident: Stephen Greenspan, a "bright and well regarded" man who wrote a book on gullibility, "Why We Get Duped and How to Avoid It," lost a third of his retirement savings in a Ponzi scheme.
It's not an isolated incident: the media often calls our attention with great delight to intelligent and distinguished people who should know better, but who nonetheless make gormless decisions with horrific consequences to themselves and others. A few other examples are provided:
- The Long-Term Capital Management fund (LTCM), which had years of strong performance, lost over $4 billion because its managers failed to consider that price fluctuations could be far larger than historical data suggested.
- The explosion of the space shuttle Columbia was traced to a piece of foam insulation that fell off: the issue had been identified in the past, but was nonetheless disregarded by the engineers.
- Iceland had the most rapidly growing banking system "in the history of mankind" - and then went on a "debt-fueled spending spree" that collapsed its largest banks and dropped the national currency by 70% ... in effect, "the country ran off an economic cliff."
No one makes a decision to do the worst thing conceivable, but bad decisions are made frequently, even by intelligent people who should know better. In fact, smart people make poor decisions for the same reasons considerably less intelligent people do.
The author cites a psychologist who argues that traditional IQ tests do not measure the mechanisms that lead us to make sound decisions. Such tests measure perceptive and cognitive capabilities - but mental flexibility, the ability to assess evidence, and other factors that are requisite to sound decision-making are largely, if not entirely, absent from traditional measures of intelligence.
Mental flexibility is of particular importance: our minds become set in their ways and our perception becomes limited - it's much like an optical illusion, in which a mind is geared to look for certain visual cues to identify something, and will ignore other cues - such that it takes some effort to "see" the shape that's hiding in plain sight, even when someone else points it out to us.
To see the hidden shape in an optical illusion, or to recognize the problem we're not seeing in the data, requires us to think twice - to step back from a knee-jerk reaction, un-focus our minds, and see things in a different way. The human brain, which strives for efficiency, wants to resist doing so.
Even when people work together, the point of which is to gather multiple perspectives, the tendency is to quickly seek consensus - to move forward on the first thing that comes to mind, and to ignore or argue against any suggestion to the contrary. In doing so, we immediately lose the benefit of teamwork and the expertise of others.
There is tragedy in situations where we realize we failed to do this - but great opportunity for improvement if we recognize that this is our tendency, and work to change our habits to overcome the problem. That is the point of the present book.
The author has developed a process that involves three steps:
- Prepare. The first step is to be mentally prepared - to understand the kinds of mistakes we make and the reasons we make them, and to find the humility to recognize that we must change from the ways of thinking we assume to be productive.
- Recognize. The second step involves recognizing problems in context: the kind of problem, the risks involved, and the tools that can be used. Mistakes often arise from failure to notice that something is different about a given situation, and a different approach is necessary to succeed.
- Apply. The most important step in avoiding mistakes, rather than merely analyzing those we have already made, is learning to use these tools in the process of decision-making, in real time and in the real world.
And given what happened to Greenspan, the author feels it important to concede that he is not at all immune to making cognitive mistakes. No one is perfect or makes flawless decisions, and perfection is likely an ideal that cannot be achieved: but what we can and must do is strive to become less imperfect, and make better decisions.
Experiments in Rational Decision-Making
The author mentions an experiment that he has done - presenting a jar of coins and asking students to bid on it: the winner is not the person with the most accurate estimate, but the one with the highest bid. Naturally, this requires bidders to estimate the value and bid below it, so that the exchange is profitable, but there is still pressure to be the highest bidder in order to "win." Invariably, the winner has paid more for the coins than they are worth.
This turns out to be true in many auction situations, whether bidding for a specific item, purchasing a share of stock, or making a tender offer in a corporate acquisition. Whether people make poor estimates of the value, or whether they are caught up in "winning" and ignore the financial aspects, they make very bad decisions. (EN: In fairness, the seller is also to blame - where things work out in favor of the buyer, the seller made a bad decision.)
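The winner's-curse dynamic is easy to reproduce in a quick simulation. The sketch below is a hypothetical model, not the author's classroom experiment: each bidder forms an unbiased but noisy estimate of a jar worth $20 and bids only 90% of that estimate, yet with enough bidders the highest bid still tends to exceed the true value.

```python
import random

def simulate_auctions(true_value=20.0, n_bidders=30, n_auctions=10_000,
                      noise=0.5, shade=0.9, seed=42):
    """Monte Carlo sketch of the winner's curse: each bidder forms a noisy
    estimate of the jar's value and bids a shaded fraction of it; the
    highest bid wins. Returns the average amount the winner paid."""
    rng = random.Random(seed)
    total_paid = 0.0
    for _ in range(n_auctions):
        bids = []
        for _ in range(n_bidders):
            # Estimates are unbiased on average, but scattered around the truth.
            estimate = true_value * (1 + rng.uniform(-noise, noise))
            bids.append(shade * estimate)  # bid below your own estimate
        total_paid += max(bids)  # the highest bid wins the auction
    return total_paid / n_auctions

avg = simulate_auctions()
print(f"true value: 20.00, average winning bid: {avg:.2f}")
```

Even though every bidder shades their bid below their own estimate, the auction systematically selects the most optimistic estimate - so the winner overpays on average.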
Another experiment considers how confident people are in the wrong answer. In theory, we should have greater confidence in our answers when the answer is closer to correct, but in practice, there is much weaker statistical correlation, such that we are as likely to be confident in an answer that is way off the mark as we are in an answer that is only slightly wrong. As such, we can't trust our "gut" to be an accurate guide to when we need to do more careful thinking.
The next experiment is simply a failure of rationality: a group of people are asked to write a number from zero to one hundred, and the winner is the person whose guess is closest to two-thirds of the group's average. Most participants immediately begin doing calculations - when the rational course (obvious once you consider that "two-thirds of the average" encourages everyone to write a low number) is simply to write zero. (In practice, the winning answer generally works out to be between eleven and fourteen.)
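The logic behind the zero answer can be sketched as iterated reasoning: whatever the group's average is, your best response is two-thirds of it, and if everyone reasons the same way, the target keeps shrinking. A minimal illustration:

```python
def iterated_guess(start=50.0, rounds=10):
    """If everyone guessed `start`, the best response is 2/3 of that; if
    everyone reasons one step further, it is 2/3 of *that*, and so on.
    Iterating the reasoning drives the 'rational' guess toward zero."""
    guess = start
    history = [guess]
    for _ in range(rounds):
        guess *= 2 / 3
        history.append(guess)
    return history

steps = iterated_guess()
print([round(g, 1) for g in steps])
```

Most real players stop after only one or two rounds of this reasoning, which is why the winning answer lands in the low-to-mid teens rather than at zero.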
The Magic Square: Making Hard Problems Easy
The author mentions another experiment, by way of demonstrating how a difficult task can be made simpler: it's a game in which two players take turns picking cards numbered one through nine, each trying to collect three cards that add up to exactly fifteen while preventing the opponent from doing the same. It's similar to tic-tac-toe in that it usually results in stalemate unless one of the players makes an obvious mistake.
The point of the experiment is to witness how much effort people put into thinking about how to play it, keeping track of the cards they have and those their opponent has chosen - they put a great deal of deliberation into it.
It's really quite simple if you consider the following "magic square" diagram, in which the rows, columns, and diagonals show all possible combinations of cards that would add to fifteen:
8 3 4
1 5 9
6 7 2
Once you recognize that pattern, you should be able to decide very quickly which cards to take to thwart your opponent.
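The square's claim is easy to check: every row, column, and diagonal sums to fifteen, and those eight lines are exactly the eight three-card combinations of 1 through 9 that total fifteen - which is why the card game reduces to tic-tac-toe:

```python
from itertools import combinations

square = [[8, 3, 4],
          [1, 5, 9],
          [6, 7, 2]]

# Every row, column, and diagonal of the magic square sums to fifteen.
lines = ([row for row in square]
         + [[square[r][c] for r in range(3)] for c in range(3)]
         + [[square[i][i] for i in range(3)],
            [square[i][2 - i] for i in range(3)]])
assert all(sum(line) == 15 for line in lines)

# There are exactly eight three-card combinations of 1..9 totaling fifteen --
# the same eight lines, so the card game is tic-tac-toe in disguise.
triples = [c for c in combinations(range(1, 10), 3) if sum(c) == 15]
print(len(triples))  # 8
```

Playing for three-in-a-row on the square is therefore equivalent to playing the card game, and any tic-tac-toe strategy carries over directly.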
The point is: people fall into familiar mental patterns, and once they think they have a solution to a problem, they will no longer see alternative approaches that would be simpler and more effective. Our brains are trained to find a strategy and act on it, and resist going backward.
People who act quickly and do less thinking feel that they are ahead of their competition - but this is akin to running very quickly away from the finish line.
Process or Outcome: Which Should You Focus On?
Three factors determine the outcome of a decision: how you think about the problem, which actions you take, and factors you are unable to control. Given that the third is unknowable, two of these factors remain under your control.
We are inclined to focus on outcomes, and seek to control the actions we take. After all, the results we achieve will depend on the things we actually do. This is the majority perspective, and it is the exact reason things tend to go wrong.
The most challenging decisions include an element of uncertainty, and leaping to action ignores the possibility of unintended outcomes. We may luck out if things work out just right, including things beyond our control, but we will on occasion end in disaster.
A quick analogy is basic strategy for blackjack: hitting an eighteen against a dealer's six is contrary to statistical probability. But if you take the card and happen to draw a three, you win. In retrospect, it looks like you made the smart move - but in reality, you made a very bad decision and were saved by luck.
When you consider probabilities, you can better focus on the various outcomes that may result, better assess the options at your disposal, and make a more informed decision. But that doesn't mean you will always win: in the blackjack scenario, if you stood and the dealer drew the three, you would lose in spite of having made the right decision.
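The blackjack numbers bear this out. Under a simplified infinite-deck model (an illustrative assumption, not a full basic-strategy calculation - it ignores the dealer's own chance of busting with a six showing), hitting a hard eighteen busts roughly three times out of four:

```python
from fractions import Fraction

# Simplified infinite-deck model: each rank is equally likely, and the four
# ten-valued ranks (10, J, Q, K) are lumped together at 4/13.
ranks = {1: Fraction(1, 13)}  # ace counts as 1 here, since 18 + 11 busts
for v in range(2, 10):
    ranks[v] = Fraction(1, 13)
ranks[10] = Fraction(4, 13)

hand = 18
p_bust = sum(p for card, p in ranks.items() if hand + card > 21)
print(f"P(bust when hitting {hand}) = {p_bust} = {float(p_bust):.0%}")
```

Only an ace, two, or three keeps the hand alive, so the chance of busting is 10/13 - which is why standing is the right process, even on the occasions when hitting happens to win.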
But over the long run, probabilities will level out. Luck will fail you far more often than it will save you, and those who rely entirely on luck may enjoy good fortune for a short time, but it will eventually run out.
About This Book
The author defines the primary audience for this book as investors and businesspeople, though the concepts are relevant to decision-making in general.
This book will focus more on the concepts than mathematics, using statistics at times to illustrate the principles, but primarily focused on psychology.
A quick outline of the chapters follows.
Then the promise: the author intends to help the reader make better decisions.