Survivorship Bias: The Hidden Danger of Analyzing Only Successes


Curated by Surfaced Editorial · History · 2 min read

Survivorship bias, well known to statisticians and historians, is a logical error in which one focuses only on "surviving" data points while overlooking those that failed or were eliminated, leading to distorted conclusions. A classic illustration comes from World War II, when statistician Abraham Wald advised the US military to reinforce the areas *not* damaged on returning aircraft. Initial analysis had focused on the bullet holes visible on returning planes (e.g., 20% in the wings, 15% in the tail), suggesting those areas needed reinforcement. Wald correctly deduced that planes hit in the engine or cockpit simply didn't return, so it was those areas, not the visibly damaged ones, that needed strengthening. The bias highlights how missing data can drastically alter our understanding of a phenomenon.
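Wald's insight can be demonstrated with a toy simulation. The numbers below (hit areas, lethality rates, mission count) are illustrative assumptions, not historical data: hits land uniformly across four areas of the plane, but engine/cockpit hits are far more likely to bring a plane down. Counting holes only on the survivors then makes the most dangerous area look like the safest one.

```python
import random

random.seed(42)

# Hypothetical hit areas and per-hit loss probabilities (assumed values).
AREAS = ["wings", "tail", "fuselage", "engine/cockpit"]
LETHALITY = {"wings": 0.05, "tail": 0.05, "fuselage": 0.10, "engine/cockpit": 0.60}

observed = {a: 0 for a in AREAS}  # holes counted on planes that returned
actual = {a: 0 for a in AREAS}    # holes across ALL planes, returned or not

for _ in range(100_000):
    area = random.choice(AREAS)          # hits are uniform across areas
    actual[area] += 1
    if random.random() > LETHALITY[area]:  # plane survives and is inspected
        observed[area] += 1

total = sum(observed.values())
for a in AREAS:
    print(f"{a:15s} share of holes on survivors: {observed[a] / total:.1%}")
```

Although every area is hit equally often, engine/cockpit holes are underrepresented among the survivors, exactly the gap in the data that Wald noticed.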

Why It’s Fascinating

Experts find survivorship bias intriguing because it illustrates how easily perception can be misled by incomplete information, leading to costly and counterproductive decisions. It overturns the intuitive approach of fixing what's visibly broken, instead pushing us to consider what isn't visible. Understanding the bias matters in fields like startup investing (where imitating traits of successful companies ignores the failed companies that shared those same traits), medical research (interpreting drug-trial results when dropouts go uncounted), and product design (learning from product failures, not just successes). It's like trying to understand why some fish escape a net by studying only the fish that got caught: you're missing the crucial information about the ones that slipped through. Engineers, investors, and product developers benefit most when they learn from both successes and failures. What crucial lessons are we missing by observing only what endures?
