
Looking for the Gorillas in Our Midst

Curiouser and Curiouser

Kylie Sturgess

September 24, 2010

An Interview with Professor Daniel Simons and Christopher Chabris

Professor Daniel Simons and Christopher Chabris have released The Invisible Gorilla and Other Ways Our Intuitions Deceive Us, a book about their ten years of research on why people succumb to everyday illusions and what they can do to inoculate themselves against the effects of these illusions.

The book details Chabris and Simons’s original experiments and how their results might explain why companies spend billions on products they know will fail, why award-winning movies are full of editing mistakes, and other behaviors caused by everyday illusions. They combine the work of other researchers with their own findings on attention, perception, memory, and reasoning to reveal how faulty intuitions often get us into trouble.

Kylie Sturgess: What are the origins of the experiment [in your 1999 article in Perception], “Gorillas in Our Midst: Sustained Inattentional Blindness for Dynamic Events”? It’s been around for over a decade now; it must have been quite a journey.

Daniel Simons: It has been! When we first did the study, it was originally intended to be a replication and an extension of a much earlier study [from the 1970s] by our colleague Ulric Neisser. It was just a study of attention and perception; since then it’s kind of taken on a life of its own that we never anticipated. So, the original study was one that we did about a dozen years ago—we were both at Harvard—[for] a class I was a teaching assistant for, an undergraduate methods class [in which] we decided to do a class project and create the study.

The study was fairly simple; we had people watch a video of players passing basketballs. And all you had to do was … count how many times the players wearing white passed the ball. We made it harder because sometimes they faked passes, and they moved around. There were also three people wearing black shirts passing their own ball, and you were just supposed to count the passes by the players wearing white, so it’s a selective attention task.

About twenty or thirty seconds into the task, we had a person in a full-body gorilla suit walk into the scene, get to the middle of the scene, turn and face the camera, thump her chest, and walk off to the other side with a total [appearance time] of about nine seconds. What we find is that half the people trying to count the passes by the players wearing white—fully half of them—don’t see the gorilla at all and are shocked when we show [the footage] to them [again].

Sturgess: The move from the laboratory to real-world applications is the focus of The Invisible Gorilla. What have been some of the important findings that have resulted from this?

Simons: Well, I think that the starting point of the book is the gorilla experiment. The broader theme that we’re addressing is the many ways in which our intuitions about our minds are wrong. So the thing that is interesting about the gorilla study is not just that people fail to see a person in a gorilla suit, which although interesting is not the really critical thing! The really critical thing is that people fully expect that they would [see the gorilla]—and that’s because we have a mistaken idea about the way our mind works. We have what we call the “illusion of attention”: we think we notice, and will notice, anything important that happens right in front of us. But the reality is that we’re aware of a far smaller subset of our world than we think. That illusion, as we’ve discussed it more and more over the years, applies to a lot of our cognitive processes, a lot of how our mind works. It’s not just about attention and perception; it’s about how we think and reason and remember and anticipate. So, the book is much more broadly about how we think about our own minds in general, not just about perception and attention.

Sturgess: I was particularly impressed by Chapter Five in The Invisible Gorilla, where you looked at vaccinations and the vaccination myths that are out there. What led you to tackle this issue? … I was really surprised to find it [in the book].

Christopher Chabris: I’ll start with that. In the book we sort of progress from perception—we talk about what we pay attention to and what we notice—into memory. And then from memory, [we progress] into questions of confidence, which is sort of a natural connection. We have different levels of confidence in our memories; we’re certain of our memories even when they’re really not accurate. And from there, we sort of moved in the direction of “well, what about our confidence about what we know, and where do those beliefs come from?” The illusion that you’re talking about here is called “the illusion of cause.” It’s the illusion that we understand what causes what in the world. In fact, we often draw the conclusion that one thing is causing another from evidence that doesn’t support that conclusion.

For example: if two things merely happen to occur together, there’s an association between them: “Kids who watch more television are more aggressive.” Well, it’s natural to assume that watching more television made them more aggressive. But it could equally go the other way: it could be that kids who are more aggressive are more likely to watch television. Or there could be some third factor that causes both of them: maybe their parents don’t pay much attention to them, so the kids watch TV and also act aggressively; you could create an infinite number of explanations. But our minds for some reason like to jump to the causal conclusion: that the first thing must be causing the second.
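Chabris’s third-factor scenario lends itself to a quick demonstration. The sketch below is an invented illustration, not an analysis from the book: a hidden common factor (labeled “inattention” purely for the example) drives both television hours and aggression, neither of which influences the other, yet the two still come out correlated.

```python
import random

# Made-up illustration (not from the book's research): a hidden common
# cause drives both variables, so they end up correlated even though
# neither one causes the other.
random.seed(0)
n = 10_000

tv_hours, aggression = [], []
for _ in range(n):
    inattention = random.gauss(0, 1)                     # hidden third factor
    tv_hours.append(inattention + random.gauss(0, 1))    # aggression plays no role
    aggression.append(inattention + random.gauss(0, 1))  # TV hours play no role

# Pearson correlation between the two variables
mean_tv = sum(tv_hours) / n
mean_ag = sum(aggression) / n
cov = sum((t - mean_tv) * (a - mean_ag) for t, a in zip(tv_hours, aggression)) / n
sd_tv = (sum((t - mean_tv) ** 2 for t in tv_hours) / n) ** 0.5
sd_ag = (sum((a - mean_ag) ** 2 for a in aggression) / n) ** 0.5

print(f"correlation: {cov / (sd_tv * sd_ag):.2f}")  # ~0.50: association, no causation
```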

In fact, it can be even easier [than that]. We don’t even need the kind of statistical data like TV and aggression! We just need one good story: one’s personal experience [of] having their child … vaccinated for measles, for example (or any one of the childhood vaccinations, but measles tends to be focused on) and then [the child] be[ing] diagnosed with autism, unfortunately, shortly after that. Those two things might be connected closely in time (say the vaccination preceded the diagnosis) and—this is crucially important—there has to be some sort of plausible reason or some belief that would let you connect those two things. So, “Vaccination involves injecting strange, foreign substances into the body, and autism is a noticeable change in behavior.”

So you can see how those things might inform the belief that one causes the other. And being told later that people have done studies, with hundreds of thousands of subjects, finding that there is no difference in the rate of autism between people who have been vaccinated and people who haven’t been vaccinated … can have very little effect on the causal belief you’ve formed, based on the vivid story of personal experience. And that’s the illusion we’re talking about in that chapter, this illusion of cause: believing that you’ve detected that one thing causes another when in fact all you’ve got is that one thing came before the other, or that one thing happens in the presence of the other, with no causal relationship between them.

Simons: Let me just amplify one thing about this: we’re not saying that people who do this are in any way dumber or less educated or less intelligent...

Sturgess: No...

Simons: ... in comparison to anyone else. This is just what all of us do naturally; we reason based on anecdotes and stories and examples. And it’s very hard to reason based on statistics; it’s not something we do naturally, any of us—even scientists. So, if you look at the people who are the proponents of this link between vaccines and autism, at least in the United States, on average they are better educated than the general public, and they’re higher income [earners] than the general public. So, these are not people who are lacking in education or resources. It’s something we do very naturally because we accumulate anecdotes really well, and in many ways it’s a lot like accumulating this belief that you’ll always notice unexpected things because you’re only aware of the [situations in which] you noticed [them].

In the case of vaccines and autism, the only cases that get reported are the ones where people think there is a link. The anecdotes are: “My kid got vaccinated and now [he has] autism.” It doesn’t get reported: “My kid got vaccinated and he didn’t get autism!” because that’s not newsworthy. So we don’t accumulate the numbers we would need, the anecdotes in our minds that would give us the right pattern. Instead, we accumulate the ones that get publicized, and that leads to the wrong pattern. It happens very naturally: we accumulate anecdotes, and when we do that, we give them a lot more weight than we probably should.
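Simons’s point about the unreported anecdotes is, in effect, a point about a two-by-two table: judging whether vaccination and autism are linked requires all four counts (vaccinated or not, diagnosed or not), not just the one vivid cell that makes the news. A minimal sketch, with invented numbers chosen so the diagnosis rate is identical in both groups:

```python
# Invented counts, for illustration only: the diagnosis rate is the
# same in both groups, so there is no association to detect.
vaccinated_diagnosed = 100       # the only cell the anecdotes report
vaccinated_not = 99_900          # "vaccinated, no autism" never makes the news
unvaccinated_diagnosed = 10
unvaccinated_not = 9_990

rate_vac = vaccinated_diagnosed / (vaccinated_diagnosed + vaccinated_not)
rate_unvac = unvaccinated_diagnosed / (unvaccinated_diagnosed + unvaccinated_not)

print(f"vaccinated:   {rate_vac:.3%}")    # 0.100%
print(f"unvaccinated: {rate_unvac:.3%}")  # 0.100%
# Counting only the first cell produces the illusion of a link.
```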

The book The Invisible Gorilla and Other Ways Our Intuitions Deceive Us (http://www.theinvisiblegorilla.com) is available worldwide.

Kylie Sturgess

Kylie Sturgess is the host of the Token Skeptic podcast and regularly writes editorials for numerous publications and the Token Skeptic blog. She was the co-host of the Global Atheist Convention in 2010 and 2012. An award-winning philosophy teacher, Kylie has lectured worldwide on teaching critical thinking and anomalistic beliefs. In 2011 she received the Secular Student Alliance Best Individual Activist Award, and she presented at the 2012 World Skeptics Congress.