Happy Petrov Day to those who celebrate. On September 26, 1983, Stanislav Petrov made the correct decision to not trust a computer.

The early warning system at command center Serpukhov-15, loudly warning of a nuclear attack from the United States, was of course modern and up-to-date. Stanislav Petrov was in charge, working his second shift in place of a colleague who was ill.

Many officers facing the same situation would have called their superiors to alert them of the need for a counter-attack, especially as fellow officers were shouting at him to retaliate quickly before it was too late. Petrov did not succumb to the pressure.

I've attached a short clip from a reenactment of the situation in the documentary The Man Who Saved the World.

The computer was indeed wrong about the imminent attack, and by daring to wait for ground confirmation during those impossibly stressful minutes, Petrov likely saved the world from nuclear disaster. For context, one must also remember that this happened at a time when US-Soviet relations were extremely tense.

I've previously written about three lessons to take away from Petrov's actions:

1. Embrace multiple perspectives

The fact that it was not Stanislav Petrov's own choice to pursue an army career speaks to me of how important it is to welcome a broad range of experiences and perspectives. Petrov was educated as an engineer rather than as a military man. He knew how unpredictably machines can behave.

2. Look for multiple confirmation points

Stanislav Petrov understood what he was looking for. While he has admitted he could not be 100% sure the attack wasn't real, he has mentioned several factors that played into his decision:

- He had been told a US attack would be all-out. An attack with only 5 missiles did not make sense to him.
- Ground radar failed to pick up supporting evidence of an attack, even after minutes of waiting.
- The message passed too quickly through the 30 layers of verification he himself had devised.

On top of this, the launch detection system was new, and hence he did not fully trust it.

3. Reward exposure of faulty systems

If we keep praising our tools for their excellence and efficiency, it becomes hard to later accept their defects. When shortcomings are found, they need to be communicated just as clearly and widely as successes. Maintaining an illusion of perfect, neutral, flawless systems will keep people from questioning them when they most need to be questioned.

We need to stop punishing failure when it helps us understand something that can be improved.