Sifting Through The Ashes of The 3 Biggest Nuclear Disasters
Last week I described the economic barriers to nuclear power as a solution to our future energy security. Today I’ll examine the big three nuclear disasters – Three Mile Island, Chernobyl, and Fukushima – and what they can tell us about this technology.
Nuclear power plants are based on the fundamental principle that when atoms are split apart, a tiny portion of matter is turned into energy according to Einstein's famous E = mc². These so-called fission reactions produce heat in the core of a nuclear reactor. The heat turns liquid water into steam, the steam drives a turbine, and the turbine produces electricity. The concept is simple enough.
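To get a feel for the scale of E = mc², here is a minimal back-of-the-envelope sketch. The function and the 0.1% mass-conversion figure for fission fuel are illustrative assumptions, not values from the article:

```python
# Illustrative arithmetic for E = m * c**2: how much energy comes
# from converting a tiny amount of matter?

C = 299_792_458  # speed of light in m/s (exact, by definition)

def mass_to_energy_joules(mass_kg: float) -> float:
    """Energy in joules released if the given mass is converted entirely to energy."""
    return mass_kg * C**2

# Converting just 1 gram of matter yields roughly 9e13 J --
# on the order of the energy in thousands of tonnes of coal.
energy_j = mass_to_energy_joules(0.001)
print(f"1 g of matter -> {energy_j:.2e} J")
```

In practice only a small fraction of the fuel's mass (roughly a tenth of a percent for fission, as a ballpark assumption) is actually converted, but the enormous c² factor is why so little fuel produces so much heat.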
However, nuclear power plants are anything but simple. They are a complex, intricate, interconnected web of systems designed to produce power while ensuring that radioactive materials are not released into the environment. The problem is that the people who design the plant aren't the people who run the plant. The designers don't necessarily build the plant so that it is easy to run, and the operators don't necessarily run the plant in the way that the designers intended.
The Three Mile Island accident in March of 1979 was not caused by one of the plant's systems failing catastrophically. It was a cascade of small, individually unrelated failures: a temperature indicator positioned where operators couldn't see it, a relief valve that failed to close again after emergency venting, an indicator that led the operators to believe the valve was closed when it really wasn't, auxiliary pumps whose supply had been disconnected and so were pumping air instead of water, and operators who succumbed to a group mentality that prevented them from truly understanding what was happening as the disaster unfolded.
Any one of these failures by itself would have been innocuous. However, the system as a whole was incredibly complex – too much so for any one operator to understand it in its entirety. What is more, there was little room in the plant design for a problem to occur without having knock-on effects. Both of these factors are chronicled in disturbing detail in Charles Perrow’s Normal Accidents. Successive layers of protection fell like dominoes, the reactor melted down, and radioactive material was released into the environment.
There were no fatalities as a direct result of the accident, and estimates of the broader effects of radioactivity on public health and property are varied and disputed. The cleanup lasted twelve years and cost $1 billion, and TMI remains the worst civilian nuclear accident in the history of the United States. National enthusiasm for nuclear power fizzled. The growth in new plant construction faded, with dozens of new plant projects cancelled.
Elsewhere in the world, the severity of Three Mile Island was discounted. Construction of new reactors continued. It wasn’t until the Chernobyl disaster of 1986 that the global nuclear industry reached a turning point.
The difference in knowledge between those who designed the plant and those who operated it was even more pronounced at Chernobyl than at TMI. The reactor design made it very unstable at low power levels, but the plant operators did not understand this. They embarked on an experiment intended to improve the safety of the facility by reducing the time that critical cooling water pumps would be without power in the event of a grid outage.
The experiment produced an explosion that destroyed the reactor and set fire to neighbouring buildings. Thirty-one people died immediately or in the days following as a direct result of the explosion and radiation. A radioactive plume of smoke rose into the atmosphere and drifted far and wide, with estimates of the resulting premature cancer deaths ranging from 30,000 to a staggering 985,000.
Initially, human error was blamed for the disaster. Over time, it became clear that the design of the reactor was fundamentally unsafe, from the graphite-tipped control rods, which could briefly increase reactivity as they were inserted, to the lack of secondary containment. Despite this, at least 11 reactors of the same design were still in operation in Russia as of 2010.
The Chernobyl reactor was designed at the height of the Cold War, in a nation obsessed with economic superiority regardless of the human cost. A modern reactor should be safer. Shouldn't it?
At first blush, the Fukushima disaster of 2011 was caused by a one-two punch of natural disasters. The plant shut down in response to an earthquake, but the resulting tsunami knocked out power to the cooling water pumps – the exact risk that the Chernobyl experiment was intended to mitigate. Deprived of coolant, the reactors overheated and melted down.
Designers make mistakes. They assume that failures will be isolated rather than cascading. They assume operators will run the plant within the specified parameters. They fail to imagine scenarios that take into account all the possible eventualities. These are all problems that can be overcome, if we only have the humility to learn from our mistakes.
However, there is one limitation that designers cannot avoid, no matter how adept they might be. The entirety of modern records on extreme weather events and natural disasters covers only the tiniest sliver of the planet’s history. These records become the basis for designs. An earthquake of a particular magnitude, based on available data, may be expected to occur once in fifty years. A more intense quake would occur once in a hundred years. These statistics are then used to establish a so-called design event.
This is where the disciplines of statistics and economics meet. The more extreme the design event, the more costly the design required to withstand it. At some point, the decision-makers have to quantify their tolerance for risk, and give that to the designers. No plant can be disaster-proof. It simply isn’t affordable.
Each year we roll the dice. Whatever happened last year is irrelevant. You could have two 100-year earthquakes one after the other. The probability is low, but it is not zero.
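The "roll the dice each year" framing can be made concrete. A "100-year" event has an annual exceedance probability of 1/100, and each year is treated as an independent trial. This minimal sketch (function names are my own, not from the article) shows why a 100-year event is far from rare over a plant's operating life:

```python
# A "100-year" design event has an annual exceedance probability p = 1/100.
# Treating each year as an independent Bernoulli trial:

def prob_at_least_one(annual_p: float, years: int) -> float:
    """Probability of at least one exceedance over the given horizon."""
    return 1 - (1 - annual_p) ** years

p = 1 / 100

# Over a typical 50-year plant life, a "100-year" event is more likely
# than not to be avoided -- but only just.
print(prob_at_least_one(p, 50))   # roughly 0.39

# Two 100-year events in consecutive years: low probability, but not zero.
print(p * p)                      # 0.0001
```

The key point is that "once in a hundred years" describes a long-run average, not a schedule: last year's outcome tells the dice nothing about this year's roll.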
The challenge with nuclear power is not that it has killed a lot of people. It hasn’t, at least not in the direct cause-and-effect way that, say, coal plants have through respiratory ailments. The challenge with nuclear power is that one single event can make headlines and evoke fears – reasonable or not – of mushroom clouds on the horizon, of hair and fingernails falling out as radiation sickness takes hold, of children with horrific birth defects, of vague and varied estimates of deaths due to cancer, of mutated animals and insects, of nuclear waste lying deadly and festering for thousands of years.
As long as humans design the plants, plant designs will have flaws. As long as humans run the plants, plants will be operated incorrectly. And as long as humans are dependent on statistical data covering less than one ten millionth of the planet’s history to decide what is safe and what is not, nature will continue to surprise us in the most unpleasant of ways.
Alex Chapman is a self-titled Renewable Energy Evangelist. Renewable energy such as that from sunlight, wind, and biomass is clean, plentiful, and free. At present, these are niche players in the global energy marketplace. In the coming years, they will become the predominant way we power our economy. Alex's blog, "Brighter Tomorrow", narrates the voyage of renewable energy from yesterday's first blush of dawn to tomorrow's warm, bright and powerful high noon.