I have been thinking about errors, mistakes and failures ever since I traded my first stock decades ago. Good traders expect to be wrong, but that attitude is surprisingly rare in business. That is a shame, because having a healthy outlook on failure would benefit corporations, governments — just about everyone.

We shall dispense with the usual tired tales — yes, we all know that New England Patriots quarterback Tom Brady was picked in the sixth round of the National Football League draft, and that all-time basketball great Michael Jordan didn’t make his high school varsity team. Instead, let’s consider how we can better incorporate data into our processes, open versus closed approaches, and how we can learn to fail better.

If data is involved, then survivorship bias is not far behind. My favorite example involves Abraham Wald, a mathematician at Columbia University. Wald was a member of the War Department's Statistical Research Group during World War II. In "How Not to Be Wrong: The Power of Mathematical Thinking," Jordan Ellenberg describes how Wald addressed the challenge of armoring bombers so they could survive the fearsome attacks of fighter planes and anti-aircraft fire. The Center for Naval Analyses had performed a study showing the damage patterns of returning aircraft. Its recommendation was to add armor to the areas that showed the most damage: the planes' wings, fuselage and tail. Wald rejected that, noting that if a plane could return with its wings shot up, the wings were not where armor was needed. Instead, he advised considering the larger data set of all planes, especially the ones that did not return. "The armor doesn't go where the bullet holes are. It goes where the bullet holes aren't," he explained. "On the engines."

High stakes make aviation an excellent subject for the study of failure. In other fields, errors may be subtle, and the results not recognized for years. When there is a flying failure, planes fall out of the sky, and footage of the wreckage is on the evening news.

Matthew Syed points this out in "Black Box Thinking: Why Most People Never Learn From Their Mistakes (But Some Do)." Aviation is an open, data-rich system, with statistics going back a century: In 1912, the U.S. Army had 14 pilots, and even before the war, more than half of them (eight) would die in crashes. The Army (this was before the Air Force was established) set up an aviation school to teach pilots how to fly more safely. Unfortunately, the school had a 25 percent mortality rate.

Fast-forward a century. Syed observed that in 2013, there were 36.4 million commercial flights worldwide carrying 3 billion passengers. That year, there were only 210 commercial aviation fatalities. For context, that works out to 0.41 accidents per million flights, or an average of one accident for every 2.4 million flights. Last year (2017), zero commercial airline passengers died in jet crashes. That is an astounding improvement over the course of a century.

How did the industry achieve this? By being self-critical and learning from accidents. Every crash and near miss gets studied extensively. The Federal Aviation Administration requires all large commercial aircraft to carry a cockpit voice recorder and a flight data recorder, creating a comprehensive and objective data set that allows for the full study of failure. Even the famed black boxes themselves are subject to exhaustive review and improvement. Today, these boxes are orange, making them much easier to spot in difficult terrain or underwater, and have submersible locator beacons to aid in their detection and retrieval from the ocean. It's the perfect metaphor for how self-critical the industry is about safety.

Compare this with a closed system, like health care and hospitals. That industry has a very different approach, with vastly inferior results.

How different? Syed notes the remarkable contrast between air travel and preventable medical errors, which might result in as many as a half-million deaths a year in the U.S., at an estimated annual cost of $17 billion. After heart disease and cancer, medical errors are the No. 3 cause of death in America. Peter Pronovost, a clinician at Johns Hopkins Medical School, wondered how we would respond if, each day, two 747 jumbo jets fell out of the sky, killing roughly 900 people. That's how many people die daily from medical errors.

Why is health care so different from aviation? First, there is little publicly available data and no standardized review process when errors occur. Whatever self-examination takes place is private, sealed and not readily available for public scrutiny. There is an attitude among some that doctors are infallible saviors, creating a reluctance to admit error. Insurance costs, litigation and the protection of reputations all reduce the desire for a public accounting. In short, health care is everything that aviation is not.
