Donna Kasuska, a chemical engineer with Pegasus Group Integrated, and ChemConscious Inc. (www.everydaychemicals.com) in Downingtown, Pa., says, "Sophisticated control systems can significantly reduce plant failures by eliminating or improving the human interface. But even the most sophisticated system has to first be considered by humans, and is also subject to use and maintenance by humans."
Large automation and control systems are what engineers call complex systems, and complex systems do not behave the way simple systems do. In a simple system, cause A leads to effect B, and preventing A prevents B. A complex system has so many dependencies and interrelationships that it is impossible to predict reliably that A will always lead to B. Combinations of causes and combinations of effects often make the behavior of complex systems non-linear, so they frequently behave in ways better predicted by chaos theory than by a linear engineering model. "Understanding these systems and analyzing or accurately predicting their behavior is often difficult," say Karen Marais and Nancy G. Leveson of MIT in their paper "Archetypes for Organizational Safety." Leveson served on the Baker Commission, which investigated the 2005 BP Texas City accident.
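The chaos-theory point is easy to see in miniature. The sketch below (an illustration added here, not from Marais and Leveson's paper) iterates the logistic map, a textbook chaotic system: two plants that start in almost identical states track each other briefly, then diverge completely, so "cause A" no longer reliably yields "effect B."

```python
def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map x_{n+1} = r * x_n * (1 - x_n) from x0.

    At r = 4.0 the map is chaotic: arbitrarily small differences in
    the starting state grow exponentially.
    """
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Two "identical" systems whose states differ by one part in a million.
a = logistic_trajectory(0.200000)
b = logistic_trajectory(0.200001)

# Early on the trajectories agree; after a few dozen steps they are
# effectively unrelated.
early_gap = abs(a[5] - b[5])
late_gap = max(abs(x - y) for x, y in zip(a[30:], b[30:]))
print(f"gap after 5 steps:       {early_gap:.6f}")
print(f"largest gap, steps 30+:  {late_gap:.6f}")
```

A linear model extrapolated from the first few steps would predict the two trajectories stay together forever; the actual system makes long-range prediction from a single measured cause hopeless, which is the behavior Marais and Leveson describe in real plants.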
"We are seeing a growing number of normal, or system, accidents which are caused by dysfunctional interactions between components, rather than component failures. Such accidents are particularly difficult to predict or analyze. Accident models focusing on direct relationships among component failure events or human errors are unable to capture these accident mechanisms adequately," Marais and Leveson continue.
Their paper continues: "One of the worst industrial accidents in history occurred in December 1984 at the Union Carbide chemical plant in Bhopal, India. The Indian government blamed the accident on human error in the form of improperly performed maintenance activities. Using event-based accident models, numerous additional factors involved in the accident can be identified. But such models miss the fact that the plant had been moving over a period of many years toward a state of high risk where almost any change in usual behavior could lead to an accident."
So what happens when a plant, after an accident, for example, institutes safety actions? "Well-intentioned, commonplace solutions to safety problems often fail to help," Marais and Leveson point out. "They have unintended side effects or exacerbate problems. A typical 'fix' for maintenance-related problems is to write more detailed maintenance procedures and to monitor compliance with these procedures more closely."
David Strobhar, president of Beville Engineering (www.beville.com) and director of the Center for Operator Performance, has a similar comment. "I recently began thinking again about safety culture," he says. "I was at a plant that went to extremes to communicate and emphasize safety. I didn't get the feeling that they were all that safe. So I am still trying to determine what it is that makes a safety culture. I think of a Dilbert cartoon where it was said that if you have to have a 'name' for it, you probably don't have it."
"Back in the old proprietary days," Scott Hillman says, "you didn't talk much about cybersecurity when discussing plant security. Today, cybersecurity is at the forefront of the conversation largely due to the advent of open systems some 15 years ago. When we put open systems into the control environment, it resulted in much greater risks, and hence, the need for more effective cybersecurity measures."
"I believe," says Joel Langill, a TÜV-certified functional safety engineer and industrial security consultant for Englobal Automation (www.englobal.com), a major EPC firm located in Houston, "that there is a great deal of synergy between functional safety and industrial security. This is demonstrated not only by ISA's creation of a joint working group between ISA84 (safety) and ISA99 (security), but also by several industry trends, such as the merger of exida and Byres Research. Both disciplines address the protection of assets through risk reduction."
Functional security researchers, such as John Cusimano of exida and Joe Weiss of Applied Control Solutions (http://realtimeacs.com), author of Control's "Unfettered" blog and the newly published book Protecting Industrial Control Systems from Electronic Threats (Momentum Press), continue to point out that there is a long way to go.
"Vendors have made improvements in the level of security in their products by closing some ports," Carl Moore says, "but they haven't fully implemented ANSI/ISA-99.00.01-2007, nor have the onsite oil-and-gas users. Most locations think a good firewall is enough to stop malicious hackers, but evidence has proven otherwise. And sites still insist on having dial-in remote troubleshooting for vendor support, and this always leaves a path for malicious hackers."
Is functional safety and security an insoluble problem? Scott Hillman thinks training is one of the answers. "I don't think it is possible to develop a safety and security culture without training. Lives are on the line, and these are procedures that must be drilled into every single plant employee. You might install the fanciest equipment in the world, but if you put it all in the same window in front of the operators without training them how to read or respond to the data, it will undoubtedly lead to chaos. So training is a very critical part of the equation."
Carl Moore agrees. "Both safety and security training are vital for all process-related plants," he says. "Management often understands the safety piece of this, but rarely do they understand the security piece."
Managing risk in complex systems like the process industries is a dynamic undertaking, incorporating safety systems, security systems, product design and ongoing training. But it all starts with management. There is an old Yiddish proverb, "A fish stinks from the head down." In the 1960s, the Dow Chemical Company's Levi Leathers realized this when he mandated operating safely as Dow's primary mission. Dow's "stateful control" systems and safety culture have made the company one of the safest in the petrochemical industry. Leathers showed that if management insisted on a specific standard of operations behavior, it would become cultural and ingrained in the operating practices of the company. That's the first step in achieving a safety and security culture and making it work for the long term. "When a company has a publicly stated goal of 'Safety is our number one priority,'" Paul Gruhn says, "ask the plant manager if he'd be willing to live on the property with his family. Actions speak louder than words."