Monday, September 24, 2012

Thinking too fast

After so many books and articles and social memes about the virtues of intuition (e.g., Gladwell’s Blink, Peirce’s The Intuitive Way), it’s refreshing to find a book that takes intuition down a few notches and focuses on the many cases in which good judgment depends instead on the rational side of the brain. Daniel Kahneman’s book, Thinking, Fast and Slow, does just that. It’s one of the best books I’ve ever read, chock-full of fascinating and important findings, so many that I actually took notes, many pages of them, all the better to retain them.

It turns out that the intuitive part of the brain works best in highly regular, predictable environments, or after hours of practice with immediate and reliable feedback. But elsewhere it often warps both judgment and action unless the rational brain intervenes. Here are some examples:

The Anchoring Effect: encountering a number can bias subsequent numerical judgments in the direction of that number, even when that number is unrelated to the judgment in question. The size of an initial offer affects the price buyers are willing to settle on; so does the random number a spinner lands on.

The Florida Effect (an example of the Priming Effect): students who encountered more words relating indirectly to old age (wrinkled, Florida…) walked more slowly down a hallway than controls did.

The Accessibility Effect/The Focusing Illusion: We tend to exaggerate the significance of things that come more readily to mind or that we are currently focusing on—overestimating the risk of flying, for example, right after the latest well-publicized airplane crash.

Averaging/Norming and Prototyping (rather than summing up): people tend to judge that an intelligent, politically active woman is less likely to be a bank teller than “a bank teller and a feminist”; people rate themselves as less assertive after listing 12 times they were assertive than after listing 6 times they were assertive (the second 6 examples are harder to call to mind and less compelling than the first 6).
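The bank-teller judgment violates a basic rule of probability: a conjunction ("bank teller and feminist") can never be more likely than either of its parts alone. A quick sketch makes the arithmetic concrete (the numbers here are made up for illustration, not taken from the book):

```python
# Conjunction rule: P(A and B) = P(A) * P(B given A) <= P(A),
# since P(B given A) is at most 1. Illustrative numbers only.
p_teller = 0.05               # assumed P(she is a bank teller)
p_feminist_given_teller = 0.8 # assumed P(feminist, given bank teller)

p_both = p_teller * p_feminist_given_teller  # P(teller AND feminist)

# However plausible the added detail makes the story feel,
# the conjunction cannot be more probable than the single category.
assert p_both <= p_teller
print(f"P(teller) = {p_teller}, P(teller and feminist) = {p_both:.2f}")
```

Adding detail makes a description more representative and so more intuitively compelling, but it can only make the event less probable.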

The Substitution Effect: we tend to replace harder questions with ones that have easier, more available answers. For example, when asked how suited someone might be to lead an organization, we might base our answer entirely on how likable and articulate we find him or her.

The Halo Effect: once we start liking someone, we tend to make various unfounded and self-reinforcing assumptions about positive traits we haven’t actually observed.

A Preference for Causal over Statistical Explanations: we tend to vastly underestimate the role of chance and ignore statistical phenomena like regression toward the mean.
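Regression toward the mean needs no causal story at all; it falls out of any process that mixes stable ability with random noise. A minimal simulation (my own sketch, assuming a simple skill-plus-luck model rather than anything from the book) shows it:

```python
import random

random.seed(0)

# Assumed model: each score = fixed "skill" + independent random "luck".
n = 10_000
skill = [random.gauss(0, 1) for _ in range(n)]
trial1 = [s + random.gauss(0, 1) for s in skill]  # first measurement
trial2 = [s + random.gauss(0, 1) for s in skill]  # second measurement

# Pick the top decile on trial 1, then look at the same people on trial 2.
top = sorted(range(n), key=lambda i: trial1[i], reverse=True)[: n // 10]
mean1 = sum(trial1[i] for i in top) / len(top)
mean2 = sum(trial2[i] for i in top) / len(top)

print(f"top decile, trial 1: {mean1:.2f}; same people, trial 2: {mean2:.2f}")
# Their trial-2 average is still above 0 (they are genuinely skilled)
# but well below their trial-1 average: the luck component doesn't repeat.
```

The top performers' second scores drift back toward the overall average, which is why praising a great performance is so often "followed by" decline, and punishing a bad one by improvement, with no causal link at all.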

Base-Rate Neglect/the Planning Fallacy: we fail to take into account all that can go wrong in executing a project, and neglect to ask what the statistical success rates of similar projects have been.

In general, the intuitive part of the brain seeks out consistency at the expense of recognizing unknowns and random elements, and tries to understand propositions, in Kahneman’s words, “by making them true.” One particularly sinister result of this is that a message, unless immediately recognized as a lie, will have the same effect on the associative/intuitive brain regardless of its actual reliability.
