Process safety and cyber security – they are not the same

July 19, 2016

There is a need to discuss the use of digital safety systems for critical, high-risk applications.

Moore Industries issued a very good white paper on a logic solver for tank overfill protection (http://www.miinet.com/WhitePapersandArticles/TechnicalWhitePapers.aspx). Additionally, ABB and Siemens continue to issue articles about the use of integrated control and safety systems.

ISA99 has recognized that process safety and security are not the same and is working with ISA84, the Safety Instrumented Systems (SIS) committee, to address the related issues of safety and security. (It is not clear that the nuclear industry is addressing this very important issue.) From a process safety perspective, an analog level sensor connected to a “manual logic solver” connected to a final mechanical actuator is treated the same as a smart level sensor connected to a Windows-based logic solver connected to a smart final actuator, all connected to the Internet. However, it should be obvious that these two systems have very different cyber risks.

Many tank level monitoring and control applications are considered safety-related, either to prevent overfill (at least one dam has burst because of overfill) or to prevent the tank level from dropping so low that pumps cannot operate (there have been numerous cases of plant damage from low tank level, including in nuclear plants). Consequently, tank level monitoring is considered a safety issue in industries such as refining, chemicals, tank farms, nuclear plants, water plants, and dams.

The Moore Industries paper did a very good job of addressing the detected and undetected failure rates of level sensors, safety trip alarms (logic solvers), and final actuators, which form the basis for assuring tank overfill protection. The paper cites level sensor, logic solver, and final actuator failure rates of approximately 10^-7 failures/hour. With such low failure rates, the use of digital safety systems can be justified from a safety perspective. However, probabilistic failure rates are not relevant to cyber threats, which can occur at any time. Consequently, if a cyber attack can drive the effective failure rate much higher (say 10^-1 failures/hour), the safety justification needs to be reconsidered.
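To make the arithmetic concrete, here is a rough sketch of why the distinction matters, using the approximately 10^-7 failures/hour figure cited above. It applies the simplified single-channel (1oo1) low-demand formula PFDavg ≈ λ_DU × TI / 2 from the IEC 61508 framework; the annual proof-test interval and the SIL band mapping are illustrative assumptions, not figures from the Moore Industries paper.

```python
# Simplified 1oo1 average probability of failure on demand:
#   PFDavg ~= lambda_du * TI / 2
# lambda_du: dangerous undetected failure rate (failures/hour)
# TI: proof-test interval in hours (8760 h, i.e. annual, is an assumed value)

def pfd_avg_1oo1(lambda_du: float, test_interval_h: float) -> float:
    """Approximate PFDavg for a single channel; the linear formula is
    only meaningful while lambda_du * TI << 1, so cap the result at 1."""
    return min(lambda_du * test_interval_h / 2.0, 1.0)

def sil_band(pfd: float) -> str:
    """Map a PFDavg to an IEC 61508 low-demand SIL band."""
    if pfd < 1e-4:
        return "SIL 4"
    if pfd < 1e-3:
        return "SIL 3"
    if pfd < 1e-2:
        return "SIL 2"
    if pfd < 1e-1:
        return "SIL 1"
    return "no SIL credit"

# Random hardware failures at ~1e-7/hour support a strong SIL claim:
print(sil_band(pfd_avg_1oo1(1e-7, 8760)))  # lands comfortably in a SIL band
# A cyber compromise that raises the effective rate to 1e-1/hour
# saturates the PFD at 1 -- the device fails essentially every demand:
print(sil_band(pfd_avg_1oo1(1e-1, 8760)))
```

The point of the sketch is that the SIL claim rests on the failure rate being a stable, random property of the hardware; a cyber attack invalidates that premise rather than merely shifting the number.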

A cornerstone of process safety is the independence of the basic process control system (BPCS) and the safety instrumented system (SIS). Stuxnet not only compromised the BPCS but also bypassed the SIS. Integrating the BPCS and SIS makes them subject to common-cause risks. Some certification vendors have demonstrated that the SIS and BPCS can be “logically separated” and still be certified to IEC safety standards, but this certification has not yet been extended to cyber security considerations. Consequently, the stakes are much higher for protecting the BPCS from cyber attack because of the BPCS's connection to the SIS.

Currently, there is very little cyber security in either sensors or actuators. How can the safety system be secure when the sensing and actuation are not? Moreover, DHS ICS-CERT continues to issue frequent cyber security vulnerability disclosures about controllers and network and communication devices used in safety systems. Whether a specific component has “cyber vulnerabilities” may or may not be an issue, depending on the risk and remote accessibility of those components. However, given demonstrated access to supposedly remotely inaccessible components (e.g., Stuxnet in Iran), remote accessibility for safety systems needs to be carefully considered.

There is a need to discuss the use of digital safety systems for critical, high-risk applications.

Joe Weiss