Professionalism/Chernobyl Disaster


Background[edit | edit source]

On the morning of April 26, 1986, an explosion and fire occurred at the Chernobyl nuclear power plant near Pripyat, Ukraine, resulting in one of the worst engineering catastrophes in history. The disaster killed 31 people and released large amounts of radiation into the air that spread far across Europe[1]. The Soviet Union spent 18 billion dollars on a cleanup effort requiring over 500,000 workers[2]. The failure can be attributed to poor reactor design, operator error, and organizational error. Three decades later, the nuclear fallout from Chernobyl continues to wreak havoc on the environment and human health.

Accident[edit | edit source]

The Chernobyl nuclear power station lacked a secure contingency plan in case of a power failure. Backup generators were supposed to keep the reactor's cooling systems powered during a safe shutdown but required about 60 seconds to reach full output. That 60-second gap was deemed unacceptable, as the reactor could easily destabilize during that time. Engineers needed a way to power the cooling pumps until the backup generators kicked in, and theorized that the residual momentum of the slowing steam turbines could bridge the gap between a power outage and fully running generators[3].

Experiment[edit | edit source]

[Image: Reactor diagram]

Engineers wanted to test their theory experimentally and conducted tests in 1982, 1984, and 1985, all with negative results. A fourth test was scheduled for 1986 to learn how long the steam turbines would keep running after a routine shutdown of reactor 4. The experiment required several conditions: the reactor had to be outputting between 700 and 800 megawatts of power, and the steam turbines had to be running at full speed. Before the test, Chernobyl operators made preparations to ensure the reactor's power output remained unaffected. Engineers disabled the emergency core cooling system and removed control rods, because these safeguards had not proven useful during previous tests[3].

Once the test began, unstable conditions forced the reactor down to an unsafe power level of 30 megawatts, only about 5 percent of the power required for the experiment. To raise the output, engineers removed even more control rods. The reactor then entered a positive feedback loop: steam bubbles (voids) forming in the water coolant reduced the coolant's ability to absorb neutrons, which further increased the power output. In response to the power increase, more coolant was pumped into the core, producing more steam bubbles, and the loop continued to amplify because the control rods that normally damped it had been withdrawn[4].
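The runaway behavior can be illustrated with a small toy simulation. This is only a sketch: the coefficients below are invented for illustration and do not model real RBMK physics. It simply assumes that power boils more coolant into voids, that voids reduce neutron absorption and so raise power, and that the damping normally supplied by the control rods is absent because the rods had been withdrawn.

    # Toy illustration of a positive void-coefficient feedback loop.
    # All coefficients are invented for illustration; this is NOT a
    # physically accurate model of the RBMK reactor.

    def simulate(steps=10, power=30.0, void=0.05, rod_damping=0.0):
        """Each step: more power boils more coolant (more voids), and more
        voids mean less neutron absorption, so power rises further.
        With rod_damping = 0 (rods withdrawn) the loop runs away."""
        history = []
        for _ in range(steps):
            void += 0.02 * power / 100.0             # more power -> more boiling -> more voids
            power *= 1.0 + 0.8 * void - rod_damping  # more voids -> less absorption -> more power
            history.append(round(power, 1))
        return history

    print("Rods withdrawn:", simulate(rod_damping=0.0))   # power climbs step after step
    print("Rods inserted: ", simulate(rod_damping=0.15))  # damping term checks the growth

Running the sketch shows the qualitative point: without a damping term the power output grows on every iteration, while even a modest damping term holds it down.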

Operators soon realized the rapid temperature and power increase had to be checked with the control rods, so they pressed the EPS-5 button to reinsert the emergency control rods and shut down the reactor. The control rods were made of boron but tipped with graphite. Boron absorbs neutrons, reducing power output, but graphite does not; none of the operators had received adequate training to know this. As the rods were lowered into the reactor core, the graphite tips entered first, displacing neutron-absorbing coolant and spiking the power output. The heat from this spike broke several of the rods' lowering mechanisms, jamming the rods with their graphite tips at the center of the core, which drove the core temperature even higher. Massive steam pressure built up until the core exploded; the last power output reading on the control panel was 33,000 megawatts[4].

Factors Leading to Disaster[edit | edit source]

Cultural Secrecy and Lack of Openness[edit | edit source]

The Soviet Union lacked a rigorous, cohesive nuclear safety policy because it developed its nuclear technology during the Cold War. Chernobyl illustrates the risks of engaging with dangerous technology in times of war, competition, and isolation. Soviet nuclear technology and reactor designs were closely guarded and escaped scrutiny internationally, owing to the fierce sense of competition with western rivals, especially the United States. They also escaped scrutiny internally, because raising a concern about design or safety was treated as direct criticism of the state and the Communist Party, which Soviet totalitarianism did not tolerate. As this case shows, a political culture in which concerns cannot be voiced prevents engineers, scientists, and government leaders from developing adequate safety policy[5].

Reactor Design and Safety Protocol[edit | edit source]

The Soviet reactor designs lacked the thorough safety features of their western counterparts; as a result, they were very unforgiving of operator mistakes[6], which were frequent due to inadequate operator training. Nuclear operators were not fully educated on the power station or its processes. When institutions fail to train employees who operate powerful and dangerous equipment, those employees tend to engage in riskier behavior because they are unaware of the dangers. In the case of Chernobyl, the operators did not understand that an explosion could occur during the safety test. Chernobyl's lack of proper safety extended beyond its operators and manifested itself in design, construction, manufacturing, and regulation. Unlike two other plants where nuclear accidents occurred, Three Mile Island and Fukushima, Chernobyl's design did not include a containment structure in case a meltdown occurred[7]. Designs without safeguards leave a very small margin for error.

Failure to Follow Chain of Command[edit | edit source]

Post hoc analysis of engineering failures reveals that lower-level employees often recognize risks but fail to prevent disaster because they are unwilling to push their concerns up the chain of command if managers are initially unresponsive. Chernobyl offers a counterexample to this finding. When engineers sought approval for the test, they failed to obtain it from the proper officials. Instead of coordinating the test with the chief reactor designer, the scientific manager, or the Soviet nuclear oversight regulator, as protocol stipulated, they obtained approval only from the plant director. By failing to engage the proper experts, the Chernobyl engineers missed three opportunities for an expert to step in[8]. Following protocol and engaging the correct people adds layers of protection when conducting dangerous tests and experiments; in the case of Chernobyl, it could have prevented the catastrophe.

Pressure to Succeed[edit | edit source]

After witnessing the plant's response to their commands, Chernobyl nuclear operators ignored several opportunities to cancel the test. Had the test been cancelled, the nuclear operators would not have been rewarded for averting disaster; rather, they would have faced severe punishment for delaying the test and questioning the judgment of the superior who issued the order. Anatoly Dyatlov, the chief engineer on the night of the accident, had a reputation as an irritable taskmaster. Documents report that he had been especially impatient leading up to the accident[9]. Even though Soviet workers did not face pressure from management seeking to maximize financial profit, they faced constant pressure from a communist system that encouraged workers to maximize production for the greater benefit of the state. Any failure, perceived weakness, or disloyalty in a worker invoked quick and severe punishment or demotion. When employees feel threatened to produce at all costs, they become less risk averse. The pressure to succeed felt by the Chernobyl operators caused them to go along with a flawed and unsafe plan, even though they did not understand what was happening.

Soviet Cover-up[edit | edit source]

[Image: Chernobyl radiation map from 1996]

News of the reactor meltdown quickly reached the Kremlin, but Soviet leaders opted not to share the information with the Ukrainian people. On May 14, Soviet leader Mikhail Gorbachev announced on television that effective measures had been taken to contain the radiation and that there was no risk. In fact, no containment efforts occurred, and later studies found that thousands of children in Ukraine developed hyperplasia of the thyroid gland. Despite knowledge of radiation hazards, Ukrainian officials ordered the inhabitants of nearby Kiev to continue with holiday festivities, even ordering schoolchildren to parade down the streets to show civilians that it was safe outside. Ukraine's health minister, Anatoly Romanenko, believed that fear of radiation was more dangerous than radiation itself[10].

Ethics[edit | edit source]

The explosion cannot be pinned on any single factor. The failure was the culmination of miscommunication, a lack of safety culture, Soviet cultural problems, negligence, and unqualified staff, which combined to produce deadly consequences. Some claim management, operators, and engineers are to blame. Examples of the questionable ethics at the time of the accident and shortly after include turning off the safety system to perform the test, the fact that the firemen initially tasked with the disaster were never taught how to cope with the situation, the Soviet cover-up, and the haphazard construction of the sarcophagus built in response. Following the catastrophe, many changes in policy and procedure were made; a leading group in the effort, the World Association of Nuclear Operators (WANO), an international organization focused on safety through open information sharing[11], was founded in 1989. In the aftermath, the Soviet Union made alterations to the RBMK design, the reactor type used at Chernobyl, and gained a heightened sense of appreciation and caution for nuclear power[12].

Impact[edit | edit source]

The Chernobyl disaster continues to impact human life and the environment. Over 7 million people and 63 thousand square miles of land have been affected[13]. Over 300,000 people were evacuated by the government after the accident[14]. While the devastated reactor leaked fumes spreading radiation across Europe, a massive effort was undertaken to decontaminate the affected areas. It is estimated that the site released nearly 400 times as much radiation as the atomic bomb dropped on Hiroshima[15]. First responders to the accident suffered massive health problems, and medical reports showed an increase in cases of cancer[16].

Conclusion[edit | edit source]

The findings from the Chernobyl incident reveal that societies must work to avoid toxic political environments in which laborers are led into error. Professionally, and especially with regard to nuclear power, safety should always be the number one priority. Oftentimes, governments engage in cover-ups to save face; ultimately, they are exposed and end up damaging their reputations more than if they had been forthcoming from the start, a lesson the governments of the world today should bear in mind. A government's reputation should not take precedence over the safety of its citizens and environment. Finally, governments and businesses need to set realistic plans that workers can accomplish. The Soviet Union developed overly ambitious Five-Year Plans that included goals that were either impossible to achieve or achievable only with shoddy quality. A case in point is the Five-Year Plan for 1981-1985, which ended right before the Chernobyl incident. During this period, the Soviet Union set forth unrealistic energy production goals, which resulted in unsafe power plants[17]. Governments and corporations today should bear this lesson in mind as they set infrastructure and expansion goals.

References[edit | edit source]

  1. Sovacool, B. K. (2008). The costs of failure: A preliminary assessment of major energy accidents, 1907-2007. Energy Policy, 36(5), 1802-1820.
  2. Johnson, Thomas. (2006). The Battle of Chernobyl. YouTube. https://www.youtube.com/watch?v=gJjhY8XqGRw
  3. a b World Nuclear Association. (2016). Chernobyl Accident 1986. World Nuclear Association. www.world-nuclear.org
  4. a b Chernobyl Gallery. Chernobyl Disaster: Cause. The Chernobyl Gallery. chernobylgallery.com
  5. Shlyakhter, A. Wilson, R. Chernobyl: the inevitable results of secrecy. Broad Institute. www.broadinstitute.org
  6. Fountain, H. (2014). Chernobyl: Capping a Catastrophe. http://www.nytimes.com/interactive/2014/04/27/science/chernobyl-capping-a-catastrophe.html
  7. Fountain, H. (2014). Chernobyl: Capping a Catastrophe. http://www.nytimes.com/interactive/2014/04/27/science/chernobyl-capping-a-catastrophe.html
  8. Liptak, B. (2009). Chernobyl Did Not Need To Occur. Control Global. www.controlglobal.com
  9. Human-factor, design and nuclear disasters. (2013). Graphene. http://graphene.limited/designer-portal/root-cause-analysis/case-studies/human-factor-industrial.html
  10. Dahlburg, J. (1991). Soviet Leaders Accused of Chernobyl Cover-Up: Disaster: Lies linked to many deaths in nuclear accident. Ukrainian report names Gorbachev, others. Los Angeles Times. articles.latimes.com
  11. World Association of Nuclear Operators (WANO). (n.d.). http://www.wano.info/en-gb
  12. International Atomic Energy Agency | Atoms For Peace. (1998). https://www.iaea.org/
  13. Chernobyl Nuclear Accident. (2001). GreenFacts: Facts on Health and the Environment. http://www.greenfacts.org/en/chernobyl/index.htm
  14. Fact Sheets. (2015, March). Nuclear Energy Institute. http://www.nei.org/Master-Document-Folder/Backgrounders/Fact-Sheets/Chernobyl-Accident-and-Its-Consequences
  15. Morris, H. (2013, November 7). After Chernobyl, they refused to leave. CNN. http://www.cnn.com/2013/11/07/opinion/morris-ted-chernobyl/index.html
  16. Greene, D. (2011, April 26). In Ukraine, Scars Of Chernobyl Disaster Remain Raw. NPR. http://www.npr.org/2011/04/26/135728490/chernobyl-25-years-later-unanswered-health-questions
  17. Reading Eagle. (1981). Tikhonov Bids For U.S. Trade. Reading Eagle