
What’s Going On in Our Minds?

Book Review

Paul Brown

Volume 36.4, July/August 2012

Thinking, Fast and Slow. By Daniel Kahneman. Farrar, Straus and Giroux, New York, 2011. ISBN: 978-0-374-27563-1. 499 pp. Hardcover, $30.

Quick as you can, read the following sentences and answer the question. Observe your mind as it fumbles to its conclusion.

Kim buys one bat and one ball. Kim spends $1.10. Kim’s one bat costs $1 more than Kim’s one ball. How much did Kim spend on the bat?

An apparently instinctive human reaction drives many people to blurt out the (wrong) answer, “One dollar!” Others pause before speaking, think for a moment, probably frown, break eye contact, and eventually respond (correctly), “One dollar . . . and five cents.” Professor Daniel Kahneman belongs to a third group of people who start thinking just as the rest of us stop. They pay attention to the mind at work and ask themselves, “Hello. What’s going on here, then?” Kahneman’s book Thinking, Fast and Slow is instructive to rationalists and skeptics because of what it tells us about the nature of human error.
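To make the slow, deliberate route explicit, here is a worked check of the arithmetic; the little script below is my own sketch, not anything from the book. Write the ball’s price as a variable, add the one-dollar premium for the bat, and see which price makes the pair total $1.10.

```python
# A quick check of the bat-and-ball arithmetic (illustrative sketch only).
# If the ball costs `ball` dollars, the bat costs ball + 1.00,
# and the two together must come to 1.10.

for cents in range(0, 111):              # try every whole-cent ball price
    ball = cents / 100.0
    bat = ball + 1.00                    # the bat is exactly $1 more than the ball
    if round(ball + bat, 2) == 1.10:     # does the pair total $1.10?
        print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")
        # prints: ball = $0.05, bat = $1.05

# The intuitive answer fails the same test: a $1.00 bat leaves $0.10
# for the ball, and then the bat costs only $0.90 more than the ball.
```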


The little gedankenexperiment above is just one of a great many that anchor the ideas of Kahneman’s book to the empirical bedrock. Much modern psychology—or cognitive science if you prefer the fashionable term—relies on experiment to pick apart our mental machinery. Researchers assemble a group of people in a room to ask them a list of cunningly designed questions. They time their subjects’ responses and compare their answers with what is “reasonable” or “rational.” What we’ve learned from all this science is that while human beings don’t all respond in precisely the same way, as a species we exhibit quite a few counterintuitive quirks and habitual, or instinctive, cognitive biases. In other words, our answers are wrong in systematic and predictable ways.

How does all of this work? Kahneman offers the following explanation: Brains evolved to be pattern-seeking machines. Biologically speaking, our brain’s primary function is to make sense of the overwhelming mess of sensory stimuli to which our environment exposes us. In Kahneman’s telling of it, brains aren’t passive receptors. Rather, our grey matter is constantly trying to construct a coherent model of the world “out there” in order to guide our reactions to it.

But there’s a problem. A brain is a terribly expensive organ to run. The human brain consumes 20 to 25 percent of the body’s energy budget. So what’s a poor organ to do? The answer appears to be that our brains use mental shortcuts (heuristics) that allow for massive reductions in brainpower at the cost of the occasional error. If you think about it, even with these shortcuts our brains are pretty astounding. Stand up, walk to the bathroom, open the door, adjust the shower heat, apply soap—consider the sea of unconsidered action upon which our relatively tiny consciousness floats. Mental heuristics are many and mostly trivial, but they can be tremendously potent. Mental shortcuts are what “tell” an experienced surgeon when his patient is about to hemorrhage, or a fire captain to pull his crew from a building just before it collapses, or a cook when the salmon is grilled to perfection. Sophisticated, learned responses often cannot be rationally justified. They just “feel right.”

Kahneman refers to this large collection of heuristics as “System 1.” System 1 “operates automatically and quickly, with little or no effort and no sense of voluntary control.” In short, System 1 is a sense-making machine, and we rely on it almost exclusively to get through our days. Alas, System 1 isn’t perfect. Error is built into its design. It is far, far better that our minds “detect” or “construct” a nonexistent snake a hundred times than overlook a real snake just once.

Now, as we saw in the example of Kim’s bat and ball, there is also a second mode of mental operation. Kahneman refers to it as “System 2,” but you’ll also see it elsewhere referred to as “executive function” or “cognitive control.” It’s the part of our brain that steps in from time to time, overriding System 1. “System 2 allocates attention to the effortful mental activities that demand it, including complex computations,” Kahneman writes. “The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration.” System 1 thinks fast; System 2 thinks slowly.

System 1’s heuristics are fine as responses to our everyday environment, but they are poor guides for reasoning about subjects expressed in terms of quantities, time periods, or probabilities—all more properly the province of System 2. Thinking, Fast and Slow is in part a catalog of the ways we slouch into error: circumstances where we opt for a quick, low-cost, and coherent worldview over mental hard labor. Kahneman shared the 2002 Nobel Prize in Economic Sciences for applying these psychological insights to the way human beings make economic decisions. And while a critique of financial markets is not strictly within the scope of a review written with skeptical and rationalist readers in mind, the book’s account of the shambles and shenanigans that characterize those markets is pretty withering.

Skeptics will find that the ideas in Kahneman’s book arm them with both sword and shield. If you’re ever exasperated at the way some human beings manage to remain sublimely indifferent to evidence, Kahneman offers an explanation that plea-bargains any charge of malevolence down to mere laziness. Mental work is hard, and monkeys don’t like it. Thinking about abstractions such as numbers or general laws of nature reduces the amount of energy and attention we can invest in more practical and immediate problems like gathering fruit, not standing in fire, avoiding bad meat, or figuring out when another monkey is up for a cuddle. If your worldview includes superstitious or irrational beliefs, your System 1 will go to extraordinary lengths to weave your experiences together to tell you a coherent story that is consistent with those beliefs—and since on a day-to-day basis there’s little penalty for believing in things that don’t exist, why change?

But the book also reinforces the idea that skeptics are human beings too, and human beings are never more prone to error than when we are overconfident. Hubris yields error, and Kahneman and his colleagues have famously shown that few human specimens display more hubris than “experts.” It’s apparent that even when we know we’re in error, even when we’ve been alerted to the nature of the mistake, even after we’ve made the mistake and it’s been pointed out to us and explained to us, it takes tremendous effort on our part to rewire our brains.

Kahneman makes that point with another of his experiments. Based on the following thumbnail biography, which of the two statements below is more probably true?

“Wendy is young, of average appearance, and socially awkward. From a young age she excelled at school and went to an elite university, completing her doctorate. She is married with a daughter.”

1. Wendy works as a librarian.

2. Wendy works as a librarian and is an active feminist.

Confronted with a version of this puzzle, no less a rationalist and skeptic than Stephen Jay Gould wrote, “A little homunculus in my head continues to jump up and down, shouting at me.” For those of you puzzled by the problem, it helps to retell the story in stark, mathematical terms. Given any two probabilistic propositions, A and B, which is more likely? A? Or A and B? If the mathematics of probability means anything, it means that “Wendy works as a librarian” must be more “probably true” than “Wendy works as a librarian and is an active feminist.” Every librarian who is an active feminist is still a librarian, and there are surely librarians who are not feminists! But our lazy brains prefer the more detailed, coherent, and more plausible story to the energy-intensive work needed to arrive at the truth. Kahneman provides several examples of large-scale mistakes of this kind made by entire communities of highly trained professionals—himself included.
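The rule the homunculus is fighting is the conjunction rule: for any propositions A and B, the probability of “A and B” is P(A) times P(B given A), and so it can never exceed P(A) alone. The snippet below makes that concrete with invented numbers of my own (they are not Kahneman’s); any values between 0 and 1 give the same verdict.

```python
# Conjunction rule illustrated with made-up probabilities (not from the book).
p_librarian = 0.02                    # assumed P(Wendy works as a librarian)
p_feminist_given_librarian = 0.60     # assumed P(active feminist | librarian)

p_both = p_librarian * p_feminist_given_librarian   # P(librarian AND feminist)
print(p_both)                         # roughly 0.012, necessarily no larger than 0.02

# Because p_feminist_given_librarian can be at most 1.0, the conjunction
# can never be more probable than "librarian" alone, whatever numbers we pick.
assert p_both <= p_librarian
```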

For readers interested in personal error—both understanding and avoiding it—the book isn’t especially comforting. Kahneman explains that human beings are terrible at perceiving their own errors and worse at learning from them. Consciously changing our own beliefs by considering the evidence is apparently terribly hard—never mind the beliefs of others. So what’s a rationalist to do?

Well, it turns out that for all our failings at self-regulation, we’re actually pretty decent at spotting mistakes made by other people. Anyone who has spent any time in a collaborative work environment will instantly recognize the phenomenon. It might never occur to me to examine each step along the path from insight to conclusion, but fortunately other people can typically be relied upon to tell me I’m wrong and why. Rationalists can take some comfort in the thought that the institutions of scientific practice and peer review exploit this aspect of human nature.

What about changing the minds of others? Here, Kahneman’s view is at first glance rather bleak, but it offers a curious kind of hope. If you ask people, “Do you believe X?” they will answer either “Yes” or “No.” If you ask them, “Have you always believed X?” they will typically respond that they’ve never changed their minds. Enquire of their opinions on a scientific or political controversy and they’ll explain them to you. Then supply new information (pro or con), repeat the questions, and lo! you will find that their minds have changed . . . hardly at all.

However, if you ask people at different times what they think, you will find that their minds have in fact changed. Psychologists have noticed that evidence changes not only our current beliefs but also our memory of what our beliefs were before we acquired the new information. The gradual evolution of public sentiment is apparent in survey responses to questions about religious beliefs, the existence of satanic cults in daycare centers, or the wisdom of the Raiders’ second-round draft choices. Minds, it seems, aren’t changed abruptly through reason but gradually, as the heuristics by which our minds construct their coherent worldviews adjust. And we typically fail to perceive how much change we’ve undergone.

Kahneman uses this System 1/System 2 framework to examine questions about how people arrive at decisions with uncertain information, how we think about risk, and how we construct our lives through a combination of experience and memory. Throughout the book readers will find memorable, pithy pronouncements such as, “Language implies that the world is more knowable than it is” or “Nothing in life is as important as you think it is when you are thinking about it” or (quoting another psychologist) “Subjects’ unwillingness to deduce the particular from the general was matched only by their willingness to infer the general from the particular.”

Anyone who has followed recent public policy debates about behavioral finance or prospect theory will find little in the book that goes beyond recapitulating the greatest hits of one man’s very productive career as a working research psychologist. Kahneman also stays mute on the kinds of big questions skeptics would like answered, such as why people persist in irrational and even self-destructive beliefs. He prefers to report the evidence and provide softer, more personal advice on how best to grapple with our own biases and guard against errors when pricing bats and balls.

For readers interested in deepening their understanding of what is going on in their own minds (and how to guard against their own errors), it’s hard to pass up a book with such a combination of persuasive power and pedigree.

Paul Brown

Paul Brown is a computer scientist who specializes in building data management applications for very large scale science projects. Although he has published extensively in perhaps the world’s most obscure professional research journals, this review is his first foray into popular writing.