Organizations that take the word of overconfident experts can expect costly consequences. A Duke University study of chief financial officers showed that those who were most confident and optimistic about how the Standard & Poor's index would perform over the following year were also overconfident and optimistic about the prospects of their own companies, which went on to take more risks than others.

As Nassim Taleb, the author of The Black Swan, has argued, inadequate appreciation of the uncertainty of the environment inevitably leads economic agents to take risks they should avoid. However, optimism is highly valued; people and companies reward the providers of misleading information more than they reward truth tellers. An unbiased appreciation of uncertainty is a cornerstone of rationality -- but it isn't what organizations want. Extreme uncertainty is paralyzing under dangerous circumstances, and the admission that one is merely guessing is especially unacceptable when the stakes are high. Acting on pretended knowledge is often the preferred approach.

Medical Certainty

Overconfidence also appears to be endemic in medicine. A study of patients who died in the intensive-care unit compared autopsy results with the diagnoses that physicians had provided while the patients were still alive. Physicians also reported their confidence. The result: "Clinicians who were 'completely certain' of the diagnosis ante-mortem were wrong 40 percent of the time." Here again, experts' overconfidence is encouraged by their clients. As the researchers noted, "Generally, it is considered a weakness and a sign of vulnerability for clinicians to appear unsure."

According to Martin Seligman, the founder of positive psychology, an "optimistic explanation style" contributes to resilience by defending one's self-image. In essence, the optimistic style involves taking credit for successes but little blame for failures.

Organizations may be better able to tame optimism than individuals are. The best idea for doing so was contributed by Gary Klein, my "adversarial collaborator" who generally defends intuitive decision-making against claims of bias.

Klein's proposal, which he calls the "premortem," is simple: When the organization has almost come to an important decision but hasn't committed itself, it should gather a group of people knowledgeable about the decision to listen to a brief speech: "Imagine that we are a year into the future. We implemented the plan as it now exists. The outcome has been a disaster. Please take 5 to 10 minutes to write a brief history of that disaster."

As a team converges on a decision, public doubts about the wisdom of the planned move are gradually suppressed and eventually come to be treated as evidence of flawed loyalty. The suppression of doubt contributes to overconfidence in a group where only supporters of the decision have a voice. The main virtue of the premortem is that it legitimizes doubts.

Furthermore, it encourages even supporters of the decision to search for possible threats not considered earlier. The premortem isn't a panacea and doesn't provide complete protection against nasty surprises, but it goes some way toward reducing the damage of plans that are subject to the biases of uncritical optimism.

Daniel Kahneman, a professor of psychology emeritus at Princeton University and professor of psychology and public affairs emeritus at Princeton's Woodrow Wilson School of Public and International Affairs, received the Nobel Memorial Prize in Economic Sciences for his work with Amos Tversky on decision making. This is the first in a four-part series of condensed excerpts from his new book, Thinking, Fast and Slow, just published by Farrar, Straus and Giroux. The opinions expressed are his own.