Lessons from Texas City

June 11, 2007
Safety requires passionate and effective leadership, says MIT professor Nancy Leveson, a member of the Baker Commission that investigated the accident at BP Texas City. Managing and controlling safety requires a clear definition of expectations, and we have to keep adapting to changing conditions. We need to avoid the "culture of denial." We need effective measurement and monitoring of process safety performance; injury rates are not useful and are misleading. We need good accident investigation and follow-through, especially the correction of systemic causal factors, and we need experienced oversight and control.

All human behavior exists in a context, a design context. We usually blame the operator, not realizing that we have created an environment in which that operator error was inevitable. To really understand why process accidents occur and to prevent them, we need to understand the current context (the system design we have) and create a design that effectively ensures safety. We can't change the human, so we must change the system.

The enemies of safety are complacency, arrogance, and ignorance. Complacency shows up as discounting risk, over-relying on redundancy, unrealistic risk assessment, ignoring low-probability, high-consequence events, assuming risk decreases over time, and ignoring warning signs. The safest time is probably right after an accident.

Leveson described an accident in which nothing failed: everything worked the way it was designed to work, and it was the system and the system design itself that failed. Making each component more reliable would have had no effect. There are two kinds of accidents: component failure accidents and system accidents. System accidents arise in the interactions among components; they are related to interactive complexity and tight coupling and are exacerbated by the introduction of computers and software controls. We need more than component safety systems.

We can't fairly define human error in retrospect. We usually define human error as deviation from normative procedures, but operators always deviate from standard procedures. We cannot effectively model human behavior by decomposing it into individual decisions and acts and studying them in isolation; behavior has to be understood within its physical and social context, within the value system in which it takes place, and within a dynamic work process. "Less successful actions are a natural part of the search by operators for optimal performance." Operators continually test their models against reality, and we need to build systems so that they can do that.

High-tech automation is changing the cognitive demands on operators. They are supervising rather than directly monitoring, doing more cognitively complex decision-making, dealing with complex, mode-rich systems, and facing a greater need for cooperation and communication. Human factors experts find that designers focus on technical issues rather than on supporting operator tasks, which leads to "clumsy" automation. The errors themselves are changing: automation removed many errors of commission, but now we have errors of omission, where people simply don't see the things that the computers are not doing. The Airbus was the most accident-prone plane of the modern era; all of the accidents were blamed on the pilots, yet the software was changed afterwards. So now we design for error tolerance, manage by exception (alarm management), match tasks to human characteristics, and design to reduce human errors.
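To make "management by exception" a little more concrete, here is a minimal Python sketch, not from Leveson's talk, of an alarm manager that reports only new excursions outside an expected band and stays quiet otherwise. The tag name, limits, and deadband are hypothetical, chosen purely for illustration.

    # Minimal sketch of "management by exception" alarm filtering.
    # Tag names, limits, and deadband values are hypothetical.
    from __future__ import annotations
    from dataclasses import dataclass

    @dataclass
    class AlarmLimit:
        low: float       # alarm below this value
        high: float      # alarm above this value
        deadband: float  # hysteresis so a hovering value doesn't chatter

    class AlarmManager:
        def __init__(self, limits: dict[str, AlarmLimit]):
            self.limits = limits
            self.active: set[str] = set()  # tags currently in alarm

        def update(self, tag: str, value: float) -> str | None:
            """Return a message only when a tag newly leaves its expected
            band; otherwise return None, so the operator sees exceptions,
            not a flood of raw data."""
            lim = self.limits[tag]
            if tag not in self.active and not (lim.low <= value <= lim.high):
                self.active.add(tag)
                return f"ALARM {tag}: {value} outside [{lim.low}, {lim.high}]"
            if tag in self.active and lim.low + lim.deadband <= value <= lim.high - lim.deadband:
                self.active.discard(tag)  # clear quietly once safely back in band
            return None

    # Hypothetical tower-pressure tag: only the excursion to 45 psig is reported.
    mgr = AlarmManager({"tower_pressure_psig": AlarmLimit(low=5.0, high=40.0, deadband=1.0)})
    for reading in (20.0, 45.0, 44.0, 38.0, 20.0):
        msg = mgr.update("tower_pressure_psig", reading)
        if msg:
            print(msg)

The deadband reflects the same design goal as the prose above: present operators with the exceptions that matter rather than every fluctuation.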
Systems must now provide operators with information and feedback, and we need to increase training and maintain skills. Safety is best seen as a control problem rather than a failure problem, which leads to new approaches to hazard analysis, design for safety, and risk analysis and management. Leveson calls this approach STAMP: Systems-Theoretic Accident Model and Processes. It takes a broad view of control, in which component failures and dysfunctional interactions may be "controlled" through design or through process. Safety is an emergent property that arises when system components interact with each other within a larger environment. The goal of process system safety engineering is to identify the safety constraints and enforce them in the system design; we build safety in by enforcing constraints on behavior. Controllers contribute to accidents not by failing, but by failing to enforce safety-related constraints on behavior or by commanding behavior that violates safety constraints. In her example, water must be flowing into the reflux condenser whenever catalyst is added to the reactor (a sketch of such an interlock follows below).

Systems are not static. Any socio-technical system is a dynamic process continually adapting to achieve its ends and to react to changes in itself and its environment. Systems and organizations migrate toward accidents (states of high risk) under cost and productivity pressures in an aggressive, competitive environment. Preventing accidents requires designing a control structure that enforces constraints on system operations as the system migrates toward failure.
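As a rough illustration of enforcing such a constraint in software, here is a minimal Python sketch of an interlock that blocks catalyst addition unless cooling water is flowing to the reflux condenser. This is a hedged sketch, not code from the talk: the flow threshold, sensor reading, and function names are assumptions, and a real plant would enforce the constraint in the safety instrumented system as well, not only in application logic.

    # Minimal sketch of enforcing the safety constraint "water must be flowing
    # into the reflux condenser whenever catalyst is added to the reactor."
    # The threshold and I/O functions are hypothetical placeholders.
    MIN_CONDENSER_FLOW_GPM = 50.0  # assumed minimum acceptable cooling-water flow

    class ConstraintViolation(Exception):
        """Raised when a command would violate a safety constraint."""

    class ReactorController:
        def __init__(self, read_condenser_flow, open_catalyst_valve):
            # I/O is injected so the sketch does not pretend to be a real DCS API.
            self.read_condenser_flow = read_condenser_flow
            self.open_catalyst_valve = open_catalyst_valve

        def add_catalyst(self, amount_kg: float) -> None:
            # Check the constraint before issuing the command, every time.
            flow = self.read_condenser_flow()
            if flow < MIN_CONDENSER_FLOW_GPM:
                raise ConstraintViolation(
                    f"Condenser water flow {flow:.1f} gpm is below "
                    f"{MIN_CONDENSER_FLOW_GPM} gpm; catalyst addition blocked"
                )
            self.open_catalyst_valve(amount_kg)

    # Example with stubbed I/O: flow is low, so the command is refused.
    controller = ReactorController(
        read_condenser_flow=lambda: 12.0,
        open_catalyst_valve=lambda kg: print(f"adding {kg} kg catalyst"),
    )
    try:
        controller.add_catalyst(5.0)
    except ConstraintViolation as err:
        print("Blocked:", err)

The point, in STAMP terms, is that the controller's job is to enforce the constraint continuously while the system operates, not merely to react after a component fails.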
