On the morning of August 7, 1997, a member of the training department at the Haddam Neck plant in Connecticut took a picture of a fire detection panel inside the main control room. The camera used its flash to light up the darkened interior of the cabinet. An alarm sounded. Three to five seconds later, the fire suppression system discharged Halon into the control room from overhead nozzles.
The Halon gas, which extinguishes fires chiefly by chemically interrupting combustion rather than by displacing oxygen as carbon dioxide does, blew into the control room, scattering papers and dislodging ceiling tiles. A falling ceiling tile struck, but did not seriously injure, an operator on his way out of the room. Within thirty seconds, everyone had abandoned the control room.
After the operators left the control room, they assembled in an adjacent room where they could monitor the control panels through a window. When an alarm light blinked on, an operator would rush back into the control room, without self-contained breathing apparatus, to respond to it.
About 35 minutes later, the ventilation system had removed enough of the Halon gas to allow operators back into the control room.
Subsequent investigation determined that the camera flash affected a microprocessor in the initiation circuit for the Halon system. The fire suppression system was designed to impose a one-minute delay between the warning alarms and the Halon discharge so that workers could safely exit the area; the flash-induced fault caused the system to discharge prematurely.
To prevent recurrence, the plant’s owners posted signs on all fire system control panels warning that photography was prohibited inside the cabinets.
It’s been said that a picture is worth a thousand words. In this case, the majority of them were probably expletives.
This event illustrates how hard it is to anticipate every possible failure mode. Even if every worker at the site had been asked about the threat posed by taking a flash photograph inside a control room cabinet, it is unlikely that anyone would have identified this outcome in time to avoid it.
The backstop to such unforeseeable misses is empirical learning. Following the Three Mile Island accident in 1979, the nuclear industry and the NRC upped their games with respect to sharing good and bad practices. This extensive information sharing lets workers learn from trial and error without each having to experience the trials and errors firsthand. Equally important, the sharing increases workers’ awareness, better equipping them to ask and answer the questions necessary to reduce the number of errors they experience.
“Fission Stories” is a weekly feature by Dave Lochbaum. For more information on nuclear power safety, see the nuclear safety section of UCS’s website and our interactive map, the Nuclear Power Information Tracker.
Support from UCS members makes work like this possible. Will you join us? Help UCS advance independent science for a healthy environment and a safer world.