Why We Miss the Signs

It's the question we all want answered: Why did so many smart people miss that the subprime market was collapsing? As early as 2001, there were signs of an impending housing bubble and the rampant use of derivatives. Yet such financial players as Northern Rock, Countrywide, Bear Stearns, Lehman Brothers and Merrill Lynch largely ignored them until it was too late. Some were more prescient. In 2002, investment guru Warren Buffett derided derivatives as financial weapons of mass destruction. And a year later, he said that complex derivatives would multiply and mutate until "some event makes their toxicity clear."

What separates the hapless from the shrewd? Did the siren call of outsize profits and bonuses, along with promises of manageable risk, make people overlook the obvious? Could they not see sooner and more clearly because they were overloaded with information?

All managers are susceptible to distortions and biases. Organizations get blindsided not so much because decision makers aren't seeing the signals, but because they jump to the most convenient or plausible conclusions. Research suggests that less than 20% of global firms have the capacity to spot, interpret and act on weak signals, whether of imminent threats or opportunities. Such peripheral indications are, by definition, muddled and imprecise. And seeing them ahead of time is more challenging than it seems, and certainly harder to do than it appears in hindsight. The first step toward catching sight of these signals is being aware of the biases that stymie us.

First of all, we are subject to cognitive biases: They underlie how we filter, interpret and often bolster information. They lead us to frame complex or ambiguous issues in overly simplistic ways. We don't fully appreciate other possibilities and become, by extension, more confident about our particular view.

For instance, what we pay attention to is very much determined by what we expect to see. Psychologists call this selective perception. If something doesn't fit, we often distort reality until it fits our mental models. Thus, we end up filtering out a lot of information.

What's more, whatever information passes through our cognitive and emotional filters can become further distorted. We rationalize, interpreting evidence in ways that sustain our desired beliefs. We fall victim to this, for example, when we're trying to shift the blame for a mistake we made onto someone else.

We also, all of us, fall prey to what is known as the fundamental attribution error: We overemphasize our own role in events, ascribing more importance to our own actions than we do to the environment surrounding us. So, in the case of business, we might be inclined to see our role in the organization as more central to its success than it really is.

Not only do we filter the limited information we pay attention to, but we also seek to bolster our ideas by searching for evidence that confirms our views. So we might talk more to people who already agree with us. Or we may actively look for information that supports our ideas while ignoring information that discounts them.

And it's not just our personal biases: When we work in organizations, we can also become very susceptible to "groupthink." In principle, groups should be better than individuals at detecting and responding to changes. But if a team isn't well managed, or there's pressure to conform to the group's ideas or even just not to rock the boat, then organizations will miss information as it comes to them. When it comes to understanding peripheral signals, debate, like sharing information, is not only healthy but necessary.

Finally, we need to remember that we don't listen to all people equally: We consider not just what is being said, but who is speaking. And we gauge credibility in terms of status, past experience, politics and the like. When information is weak or incomplete, we are especially likely to fall back on these social biases.