Blink and You’ll Miss It

Image: Gorilla (Globe and Mail)

From the Globe and Mail
Thursday, Aug. 26, 2010

I was made to feel stupid by a video on YouTube the other day. Designed by psychology professors Christopher Chabris and Daniel Simons, the clip shows some basketball players passing a ball around. Count the passes, the viewer is instructed. So I did, and got the right answer: 15. But I entirely missed the person in the gorilla suit who danced across the screen halfway through.

I’m not alone. Half the people who have taken the test missed it, illustrating how frequently we overlook the bigger picture when making quick assessments. It’s a principle Chabris and Simons explore in The Invisible Gorilla: And Other Ways Our Intuitions Deceive Us, a book that should be compulsory reading for anybody who keeps a dog-eared copy of Malcolm Gladwell’s Blink in their desk drawer. That would have included former Lehman Brothers president Joseph Gregory, who was so impressed by Gladwell’s case for “the power of thinking without thinking” that he had Gladwell address Lehman staff not long before the whole company went down the drain.

Gladwell’s concept of “rapid cognition” sure beats the drudgery of analysis when it produces optimal outcomes, but it doesn’t always work. As Gladwell acknowledges, racial stereotyping—a rapid assessment based on skin colour—is one all-too-common variety of snap judgment. Ask Denny’s or Abercrombie & Fitch—which, in recent years, have settled lawsuits relating to hiring and serving practices that have cost them a combined $100 million (U.S.)—how expensive it can become if your corporate culture is infected by that particular kind of blink decision-making.

But if racism is an obviously suboptimal outcome, there are subtler varieties of rapid cognition that are just as risky. Take experience, for example: the accumulated memories of what has happened to us in the past. We all consult experience when making quick judgments about the probability of future events. The problem is the reliability of our memories. We are, as Nobel Prize-winning psychologist Daniel Kahneman suggests, two selves in one body: an experiencing self who does the living, and a remembering self who does the recalling, forecasting and decision-making. And memory biases mean these two selves frequently miscommunicate.

The first of these biases stems from what psychologists call availability. We can only use the experiences that are available for our memory to retrieve. What’s forgotten is unavailable. And some things—notably, bad outcomes, which are highly relevant—we would often rather forget. But even if our memory is good, distortion still occurs. A 1991 experiment asked students to rate their self-confidence and to recall incidents that illustrated their assessment. The fewer illustrations they were asked to recall, the higher their self-assessment: dredging up a short list of examples felt easy, and that ease of recall was mistaken for evidence.

In their book Rational Choice in an Uncertain World, Reid Hastie and Robyn Dawes describe a related effect, in which the things we think of first when making a decision tend to swamp considerations that follow. They refer to this as the “primacy effect”: the first thing that pops into your head becomes an anchor that resists adjustment. The 20% drop in airline traffic following 9/11 was driven by this kind of thinking. For many would-be travellers, the tragedy was the first thought that came to mind, and it tended to overwhelm other considerations, such as the likelihood that air travel was indeed safer due to the increased security.

If our experience is not always available to memory, and we overweight the memories that come to us most easily, there are also biases in how our experience accumulates in the first place. Studies have consistently shown, for example, that people tend to estimate that homicide rates are higher than suicide rates, when the reverse is true. Hastie and Dawes link that false assessment to the fact that suicides get less press coverage. In this case, then, our experience, based largely on reading news accounts, accumulates in a biased way and becomes unreliable for forecasting the future.

But one of the most fascinating arguments against trusting experience comes from a 1986 study at the University of Michigan. Between 1973 and 1982, researchers sampled the political views of a group of subjects at several points. In 1982, the subjects were asked to recall the views they had reported in 1973, and their present political opinions turned out to strongly colour the way they remembered their earlier ones. People update their own memories, the researchers surmised, to align them with the present. A similar result was observed in a 1985 study of university students asked to remember what level of alcohol use they’d reported 2 1/2 years earlier. Present levels of consumption strongly influenced memories of previous usage, driving estimates downward. As George Vaillant once wrote in Adaptation to Life: “Maturation makes liars of us all.”

Or, in the even more pointed words of Benjamin Franklin: “Experience keeps a dear school, but fools will learn in no other.” Of course, “dear” in this context means “very expensive.” Franklin is issuing a warning: Rely on experience, intuition, gut calls and Blink moments, and you’ll pay for it eventually.