There has been considerable talk about safety culture in the nuclear industry in recent years. Just four days before last year’s accident at Fukushima, the NRC voted 5-0 to approve a revision expanding its policy statement on the subject.
The good news is that the NRC remains in favor of a good safety culture. The bad news is that safety culture is very difficult to weigh, measure, or otherwise quantify. All too often, safety culture problems go unrecognized until they become what Thomas Jefferson might label “self-evident.”
I attended a session on conservative decision-making during the NRC’s Regulatory Information Conference last month. It included presentations by representatives from Dominion Energy and Duke Energy regarding problems experienced last year at the Millstone and Oconee nuclear plants, respectively. (Both events are discussed in UCS’s March 2012 report, “The NRC and Nuclear Power Plant Safety: Living on Borrowed Time”.)
The Dominion and Duke presentations provided a stark contrast between an operator with a strong safety culture and one merely giving it lip service. Dominion spoke about safety culture in practice while Duke talked about it in theory. Because few managers adopt a poor safety culture by choice, confusion over safety culture theory isn’t what matters. The key to a good safety culture isn’t knowing what it is, but knowing how it’s done. Duke’s presentation demonstrated they knew most, if not all, of the key words and phrases, while Dominion’s presentation demonstrated they knew how to successfully apply the concept.
The event at Dominion’s Millstone Unit 2 involved a test. During this test, the operators lost control of the reactor’s power level, and it increased unexpectedly by 8%. This outcome occurred despite the operating crew having gone to the control room simulator only days earlier to rehearse the test procedure. Each operator had a second individual assigned to read the test procedure and peer-check that the operator was taking the correct actions. Despite these precautions, errors were made during the test and then compounded by additional errors.
Dominion’s presentation described their extensive probe into what went wrong and why. Although the test had been rehearsed days earlier in the control room simulator, an operator became unable to conduct the actual test due to a family emergency. Another individual who had not participated in the rehearsal substituted for this worker. The crew leader reassigned some of the remaining operators so that several performed tasks they had not rehearsed.
In addition, the simulator training at Millstone introduced equipment failures and malfunctions to test the operators’ ability to detect and respond to such casualties. But operator mistakes had not been role-played on the simulator; thus, the effectiveness of peer-checking had never been formally tested and verified to be adequate.
Dominion further determined that the process weaknesses revealed by these mistakes in the Operations department might also be present in two other areas at Millstone. Dominion proactively undertook to resolve these collateral problems, too.
The event at Duke’s Oconee plant involved a modification installed in 1983 as a lesson learned from the Three Mile Island accident. The modification was intended to create an additional layer of safety by providing another means of assuring reactor core cooling during an accident. But due to a design error, the conditions in the reactor following an accident could disable the installed equipment.
When this design error was finally recognized in 2011, the troubled components were hastily replaced in order to keep all three reactors operating. But the replacement components were later found to also fail when subjected to post-accident reactor conditions, albeit for a different reason. Thus, the alleged fix merely swapped one cause of safety system failure for another.
Duke’s presentation was nearly the exact opposite of Dominion’s. Both events involved standards and practices that had been successfully employed in the past but failed in these cases. Whereas Dominion explicitly identified why those failures had occurred this time and what they’d do in the future to avoid recurrence, Duke whipped out the ol’ “lack of sufficient rigor” rhetorical answers. It is well recognized that the initial modification and initial repair at Oconee resulted in bad outcomes. It remains a secret, at least from Duke’s presentation, why those outcomes occurred and what steps were taken (if any) to prevent them from happening again.
The NRC’s safety culture webpage features several case studies. These presentations by Dominion and Duke provide excellent examples of how to implement a strong safety culture and how to merely talk about one.
Support from UCS members makes work like this possible. Will you join us? Help UCS advance independent science for a healthy environment and a safer world.