Book Excerpt: Nudge: Improving Decisions About Health, Wealth, and Happiness


Excerpted from Nudge: Improving Decisions About Health, Wealth, and Happiness by Richard H. Thaler and Cass R. Sunstein (Yale University Press). Copyright © 2008 by Richard H. Thaler and Cass R. Sunstein.

The workings of the human brain are more than a bit befuddling. How can we be so ingenious at some tasks and so clueless at others? Beethoven wrote his incredible ninth symphony while he was deaf, but we would not be at all surprised if we learned that he often misplaced his house keys.

How can people be simultaneously so smart and so dumb? Many psychologists and neuroscientists have been converging on a description of the brain’s functioning that helps us make sense of these seeming contradictions. The approach involves a distinction between two kinds of thinking, one that is intuitive and automatic, and another that is reflective and rational. We will call the first the Automatic System and the second the Reflective System.

The Automatic System is rapid and is or feels instinctive, and it does not involve what we usually associate with the word thinking. When you duck because a ball is thrown at you unexpectedly, or get nervous when your airplane hits turbulence, or smile when you see a cute puppy, you are using your Automatic System. The Reflective System is more deliberate and self-conscious. We use the Reflective System when we are asked, “How much is 411 times 37?” Most people are also likely to use the Reflective System when deciding which route to take for a trip and whether to go to law school or business school.

Suppose that you are suffering from serious heart disease and that your doctor proposes a grueling operation. You’re understandably curious about the odds. The doctor says, “Of 100 patients who have this operation, 90 are alive after five years.” What will you do? If we fill in the facts in a certain way, the doctor’s statement will be pretty comforting, and you’ll probably have the operation.

But suppose the doctor frames his answer in a somewhat different way. Suppose he says, “Of 100 patients who have this operation, 10 are dead after five years.” If you’re like most people, the doctor’s statement will sound pretty alarming, and you might not have the operation. The Automatic System thinks: “A significant number of people are dead, and I might be one of them!” In numerous experiments, people react very differently to the information that “90 of 100 are alive” than to the information that “10 of 100 are dead”—even though the content of the two statements is exactly the same. Even experts are subject to framing effects. When doctors are told that “90 of 100 are alive,” they are more likely to recommend the operation than if told that “10 of 100 are dead.”

Framing matters in many domains. When credit cards started to become popular forms of payment in the 1970s, some retail merchants wanted to charge different prices to their cash and credit card customers. (Credit card companies typically charge retailers 1 percent of each sale.) To prevent this, credit card companies adopted rules that forbade their retailers from charging different prices to cash and credit customers. However, when a bill was introduced in Congress to outlaw such rules, the credit card lobby turned its attention to language. Its preference was that if a company charged different prices to cash and credit customers, the credit price should be considered the “normal” (default) price and the cash price a discount—rather than the alternative of making the cash price the usual price and charging a surcharge to credit card customers.

The credit card companies had a good intuitive understanding of what psychologists would come to call “framing.” The idea is that choices depend, in part, on the way in which problems are stated. The point matters a great deal for public policy. Energy conservation is now receiving a lot of attention, so consider the following information campaigns: (a) If you use energy conservation methods, you will save $350 per year; (b) If you do not use energy conservation methods, you will lose $350 per year. It turns out that information campaign (b), framed in terms of losses, is far more effective than information campaign (a). If the government wants to encourage energy conservation, option (b) is a stronger nudge.

Framing works because people tend to be somewhat mindless, passive decision makers. Their Reflective System does not do the work that would be required to check and see whether reframing the questions would produce a different answer. One reason they don’t do this is that they wouldn’t know what to make of the contradiction. This implies that frames are powerful nudges, and must be selected with caution.