During World War II, the Allies studied the bullet holes on damaged aircraft returning from missions over Germany.
They wanted to reinforce the planes so they could better withstand enemy fire.
What should they recommend to their superiors?
Their immediate recommendation was to reinforce the areas of the plane that had the most bullet holes. On the surface this seemed logical: those appeared to be the most affected areas.
But Abraham Wald, a Hungarian mathematician, realised that the data came only from bombers that survived; the planes that were shot down were never part of the sample. The red dots marking bullet holes therefore showed the places where a plane could sustain damage and still return home.
Wald’s brilliant thinking pointed to the opposite solution, and it saved many lives.
The areas that needed reinforcement were the ones with no red dots, because a plane hit in those places never made it home.
This cognitive error is called survivorship bias. It happens when we draw conclusions only from the things that survived a selection process, when we should also account for the things that didn't.
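The effect is easy to reproduce. Below is a minimal simulation sketch (not Wald's actual analysis; the section names, hit counts, and "fatal section" rule are invented for illustration): planes hit in a fatal section are shot down, so the surviving sample shows zero damage in exactly the sections that matter most.

```python
import random

random.seed(42)

SECTIONS = ["engine", "cockpit", "fuselage", "wings", "tail"]
# Hypothetical rule for illustration: a hit to the engine or cockpit downs the plane.
FATAL = {"engine", "cockpit"}

def fly_missions(n_planes=10_000, hits_per_plane=3):
    """Simulate missions; return hit counts per section for survivors and for all planes."""
    survivor_hits = {s: 0 for s in SECTIONS}
    all_hits = {s: 0 for s in SECTIONS}
    for _ in range(n_planes):
        hits = [random.choice(SECTIONS) for _ in range(hits_per_plane)]
        for h in hits:
            all_hits[h] += 1
        # A plane returns home only if none of its hits landed in a fatal section.
        if not any(h in FATAL for h in hits):
            for h in hits:
                survivor_hits[h] += 1
    return survivor_hits, all_hits

survivors, everyone = fly_missions()
print("Hits seen on returning planes:", survivors)
print("Hits across all planes:      ", everyone)
```

In the first dictionary, `engine` and `cockpit` always show zero hits even though hits there are just as common overall: looking only at survivors makes the deadliest areas look untouched.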
Are you making a logical error?
Where are you taking bullets and where should you focus?