
Normal Nuclear Disasters and Abnormal Startups

  • Writer: Alex Shilman
  • Jul 13
  • 4 min read

On the night of April 26, 1986, when a chain of unfortunate events led to a massive explosion in the core of Reactor No. 4 at the Chernobyl nuclear power plant, the book Normal Accidents by American sociologist Charles Perrow had already been out for two years. The book was, in effect, the “writing on the wall” predicting the disaster—only the wall happened to be on the wrong side of the Iron Curtain.


First responders

In the restrained panic that gripped the upper echelons of the Communist Party, and the frantic activity of the middle ranks, who were forced to sacrifice their own lives and those of their subordinates in an effort to contain the disaster, there was no time for real soul-searching. And when the moment arrived to draw conclusions, the Party mostly focused on finding scapegoats. HBO’s new miniseries Chernobyl effectively conveys both the terrible banality of the catastrophe and the extraordinary complexity of its circumstances.


Ironically, had Perrow’s claims been known in the USSR, they might have offered a measure of consolation to those held responsible; after all, the disaster was arguably inevitable. One might even say that the fate of the Soviet Union itself was foreshadowed by the ill-fated reactor. Years later, Mikhail Gorbachev, then General Secretary of the Communist Party and effectively the man at the top of the Soviet pyramid, would write that Chernobyl was probably the real reason for the Soviet Union’s collapse.


Charles Perrow built his career on studying complex man-made systems, from vast industrial machines to corporations with thousands of employees and globe-spanning military logistics. His work led him to a grim conclusion: beyond a certain threshold of complexity and tight coupling, any intricate system is destined to suffer unpredictable failures.


Perrow demonstrated that modern engineering and logistical systems tend to reach such levels of complexity that it becomes impossible to anticipate every scenario or design an adequate response. Moreover, the human factor is always lagging behind technological developments—either due to the ongoing need to train operators in the latest technologies or because of the diffusion of responsibility across different teams. Either way, it becomes a mathematical certainty that one small, “normal” error will combine with another small, “normal” failure in a way no one could have foreseen—creating a unique but perfectly “normal” sequence of events that leads to a critical system failure.


So what can be done? Perrow’s advice was straightforward: if you can’t contain the worst-case scenario of a system, don’t build it. Or build a simpler system instead.



From Machines to Markets and Black Swans


Years later, the Lebanese-American statistician and former options trader Nassim Nicholas Taleb formulated a similar idea, though in market terms. Taleb described the inevitable emergence of the “Black Swan”: a disruptive event that we intuitively perceive as extremely rare, but which, given the elusive nature of human self-organizing systems, is in fact bound to occur. Examples include the Black Monday crash of 1987, the bursting of the dot-com bubble in the early 2000s, and the subprime mortgage meltdown of 2008, whose shockwaves are still felt today.


But thanks to Taleb, we also know how to protect ourselves from such systemic failures. Staying true to his belief in the inevitability of Black Swans—those extreme crises that shake entire systems—Taleb bet against the market prior to all three of the above events and amassed vast wealth as a result. His philosophy is simple: if you’re building a system meant to survive within a natural ecosystem with self-organizing properties—like the free market, or a complex engineering system interacting with the world and relying on humans for proper function—you must account for extreme crises. And the best way to prepare for such crises is to design a system that grows stronger from them.


Taleb calls this survival trait Antifragility. Antifragility is not just the ability to withstand shocks; it is the ability to grow from them. In fact, this mirrors Taleb’s own life story. He came of age during Lebanon’s brutal civil war, which stripped his aristocratic Beirut family of much of its wealth and standing before he left for the U.S. But he learned to turn pain and trauma into a source of resilience and growth.



Normal Startups


The complexity of our world—and the financial and technological systems surrounding us—has only increased since Perrow’s book and Taleb’s insights. Almost every piece of consumer technology today involves hundreds of different producers. Modern knowledge economies largely rely on massive volumes of complex, tangled data—a breadcrumb trail that each of us leaves behind through even the most trivial daily actions.


Anyone involved in modern technological systems is familiar with the problem of historical layers and countless inherited dependencies. Forgive the poetic license, but at the end of every line of Java code you write, there’s likely an ancient module from Alan Turing’s era quietly puffing on a nuclear hookah.


By 2019, even the most basic systems contain a staggering number of “moving parts” that must be orchestrated together: diverse APIs, half-maintained open-source libraries, outdated core technologies, and obscure hardware components from the Far East. All of these exacerbate Perrow’s structural failure problem. And that’s without even discussing the destructive potential of targeted attacks on such vulnerabilities.


The only way to build sustainable projects within such a chaotic and failure-prone ecosystem is to consciously choose Antifragility. Use short iterations to maintain tight feedback loops. Build your product backwards—from user pain points to the technological solution, not the other way around. Create a business plan that rests on multiple independent legs—or at least one that doesn’t depend on untested assumptions and has wide risk margins.


It’s easy to see why models like Lean Startup, Agile, and Design Thinking have succeeded in recent years: they respond to the urgent need to build organic, resilient products in the face of inevitable Black Swans. So next time you start a project—check carefully who’s smoking that hookah at the end of your Gantt chart. You don’t want to end up with another Chernobyl.

