Book review: Thinking, Fast and Slow

A founder of behavioural economics explains how our brains make errors.

 

Photo: Anna Lisa Sang

One day in 1984, in the midst of that decade’s pinstriped, capitalist heyday, cognitive psychologist Daniel Kahneman, his professional partner Amos Tversky and a young economist colleague, Richard Thaler, paid a visit to a Wall Street firm. They’d been invited by a senior investment manager to talk about the role judgment biases could play in investing, and Kahneman, a finance-world newbie, asked the manager: Who buys a stock when you sell it?

“He answered with a wave in the vague direction of the window,” Kahneman writes, “indicating that he expected the buyer to be someone else very much like him. That was odd: What made one person buy and the other sell? What did sellers think they knew that the buyers did not?”

The question became one of many stuck in Kahneman’s remarkable brain. The answer that he and others formulated in the intervening years is that “a major industry seems to be built largely on an illusion of skill.” More than 100 million shares of a single stock can change hands in a day, and most buyers and sellers share the same information. Yet for different reasons, they both think the stock’s current price is wrong. And, of course, most of them are mistaken.

Kahneman’s new book condenses a lifetime’s worth of thinking about thinking. Thinking, Fast and Slow is effectively a summary of a career that broke new ground in the investigation of human judgment and decision-making; that, alongside Thaler’s, helped found the field of behavioural economics; and that was recognized in 2002 with the Nobel Memorial Prize in Economics (its recipient insisted he’d never taken an economics course in his life). More than just a summary of Kahneman’s own work, however, it draws on the work of colleagues whose ideas have cross-pollinated with his own.

So it’s got “Big, Important Book” written all over it, which can seem daunting. Bless him, then, for being an uncommonly clear and accessible writer among academics. That isn’t to say Thinking is a cakewalk. The central tension in the book is between the two modes of thinking in the human brain, which Kahneman effectively treats as characters in a story. And where a poppier writer might have named them something like “Jack” and “Jill,” here they’re just System 1 and System 2.

But their unique characteristics very quickly become clear. System 1 is the system of thought that’s instantaneous and effectively automated; System 2 channels energy into the more cognitively taxing mental activities, like concentrating and making choices. However, Kahneman writes, “although System 2 believes itself to be where the action is, the automatic System 1 is the hero of the book.”

Or, in some ways, its antihero—because it’s the ways in which System 1 works with, or outright trumps, System 2 that produce some of our quirkiest decisions and most remarkable errors in judgment. That isn’t to say we’d be better off without it. While it’s responsible for much of what we do wrong, Kahneman reminds us that System 1 “is also the origin of most of what we do right—which is most of what we do.” System 1 knows how to read. It knows instantly that the capital of France is Paris, and that when somebody smiles at you, you should smile back. But a more nuanced understanding of how Systems 1 and 2 interact has been the object of decades of research by Kahneman and his peers, and their findings account for the loss of credibility suffered by the old rational-actor theories that, until relatively recently, underpinned much of economic thought. Much of what gets us into trouble is that System 1 will react with as much (or as little) information as it has available, pushing us toward faulty decisions. But it’s not the sole culprit.

Kahneman unpacks the notion of framing, for example, where “different ways of presenting the same information often evoke different emotions.” Take organ donation: a comparison of donation rates in European countries—even in neighbouring, culturally similar ones—shows dramatic variance. The rate is nearly 100% in Austria but only 12% in Germany.

That’s a result of a framing effect. High-donation countries like Austria follow an opt-out format: if you don’t want to be an organ donor, you have to check a box. The low-donation countries use an opt-in model, whereby you have to check a box to have your liver harvested by medical science when you die. You’d assume that our susceptibility to framing effects like this one is the result of our automatic System 1 jumping to conclusions—and for the most part, you’d be right. But in fact, the organ-donation rates skew as a result of the laziness of System 2. As Kahneman writes, people will check the box if they already have an opinion on the subject: “If they are unprepared for the question, they have to make the effort of thinking whether they want to check the box.” And so, “an important choice is controlled by an utterly inconsequential feature of the situation.” This, as Kahneman points out, is embarrassing.

But it’s also our nature, laziness and all, and this book represents one of the best efforts yet to understand that nature. Though Kahneman is no economist, it’s easy to see why he’s treated with such respect in the field.

Because, of course, our reactions to those inconsequential features can have staggering real-world results. Think back to the proto-Gekko whom Kahneman et al. met in 1984, as susceptible to the illusions of skill and validity as the counterpart he was dismissing with a wave. Because stock pickers believe they’re exercising high-level skills, because they’re confident in their training and their mastery of the information at hand, and because they’re reassured by the professional culture around them, they often fail to recognize that their judgment about a stock can be coloured by any number of biases produced by the clash of Systems 1 and 2. That’s why stock picking has been shown to produce results more akin to rolling dice than playing poker—the men in pinstripes don’t understand the ways in which they’re fallible. Kahneman’s book may be the best way to start remedying that.
