Process safety and cyber security – they are not the same

Moore Industries issued a very good white paper on a logic solver for tank overfill protection. Additionally, ABB and Siemens continue to publish articles about the use of integrated control and safety systems.

ISA99 has recognized that process safety and security are not the same and is working with ISA84 – the Safety Instrumented Systems (SIS) committee – to address the related issues of safety and security. (It is not clear that the nuclear industry is addressing this very important issue.) From a process safety perspective, an analog level sensor connected to a "manual logic solver" connected to a final mechanical actuator is equivalent to a smart level sensor connected to a Windows-based logic solver connected to a smart final actuator, all connected to the Internet. However, it should be obvious that these two systems carry very different cyber risks.

Many tank level monitoring and control applications are considered safety-related, either to prevent overfill (at least one dam has burst because of overfill) or to prevent the tank level from dropping so low that pumps cannot operate (there have been numerous cases of plant damage from low tank level, including in nuclear plants). Consequently, tank level monitoring is considered a safety issue in industries such as refining, chemicals, tank farms, nuclear plants, water plants, and dam level monitoring.

The Moore Industries paper did a very good job of addressing the detected and undetected failure rates of level sensors, safety trip alarms (logic solvers), and final actuators, which were the basis for assuring tank overfill protection. The paper cites level sensor, logic solver, and final actuator failure rates of approximately 10⁻⁷ failures/hour. With such very low failure rates, the use of digital safety systems can be justified from a safety perspective. However, probabilistic failure rates are not relevant to cyber threats, which can occur at any time. Consequently, if a cyber attack can drive the effective failure rate much higher (say 10⁻¹ failures/hour), the safety justification needs to be reconsidered.
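To make the arithmetic behind this point concrete, here is a minimal sketch (my illustration, not from the white paper) of the probability of at least one failure over a year, assuming a constant random-hardware failure rate. The specific rates are the 10⁻⁷ and 10⁻¹ failures/hour figures discussed above:

```python
import math

def failure_probability(rate_per_hour: float, hours: float) -> float:
    """Probability of at least one failure over the interval,
    assuming a constant (exponentially distributed) failure rate."""
    return 1.0 - math.exp(-rate_per_hour * hours)

YEAR = 8760  # hours in one year

# Certified random-hardware failure rate of roughly the order cited (~1e-7/h)
p_hw = failure_probability(1e-7, YEAR)

# Illustrative "effective" rate if an attacker can induce failure at will
p_cyber = failure_probability(1e-1, YEAR)

print(f"hardware-only:  {p_hw:.4%} chance of at least one failure per year")
print(f"cyber-degraded: {p_cyber:.4%} chance of at least one failure per year")
```

At 10⁻⁷ failures/hour the yearly failure probability is well under 0.1%; at 10⁻¹ failures/hour it is effectively certain, which is why a probabilistic safety case offers no comfort against a deliberate attack.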

A cornerstone of safety has been the independence of the basic process control system (BPCS) and the safety instrumented system (SIS). Stuxnet not only compromised the BPCS but also bypassed the SIS. Integrating the BPCS and SIS makes them subject to common-cause risks. Some certification vendors have demonstrated that the SIS and BPCS can be "logically separated" and still be certified to IEC safety standards, but this certification has not yet been extended to cyber security considerations. Consequently, the stakes are much higher for protecting the BPCS from cyber attack because of its close coupling with the SIS.
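The value of that independence can be shown with a back-of-the-envelope calculation (my illustration; the specific probabilities are assumed, not taken from the post). When the layers fail independently, their demand-failure probabilities multiply; a common-cause pathway such as a single compromised engineering workstation removes that multiplication:

```python
# Assumed probability the BPCS fails to control a given demand
p_bpcs = 1e-2
# Assumed SIS probability of failure on demand (roughly SIL 2-3 territory)
p_sis = 1e-3

# Independent layers: both must fail for the hazard to propagate
p_independent = p_bpcs * p_sis

# Common-cause cyber attack: one compromise defeats both layers at once
p_common_cause = p_bpcs

print(f"independent layers:  {p_independent:.0e}")
print(f"common-cause attack: {p_common_cause:.0e}")
```

Under these assumptions, a shared attack path degrades the combined protection by three orders of magnitude, which is the quantitative reason integrated BPCS/SIS architectures raise the stakes.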

Currently, there is very little cyber security in either sensors or actuators. How can the safety system be secure when the sensing and actuation are not? Moreover, DHS ICS-CERT continues to issue cyber security vulnerability disclosures about controllers, network devices, and communication devices used in safety systems on a frequent basis. Whether a specific component has "cyber vulnerabilities" may or may not be an issue, depending on the risk and the remote accessibility of those components. However, given demonstrated access to supposedly remotely inaccessible components (e.g., Stuxnet in Iran), remote accessibility for safety systems needs to be carefully considered.

There is a need to discuss the use of digital safety systems for critical, high-risk applications.

Joe Weiss

  • Where was the SIS bypassed in Stuxnet? To my knowledge, the speed control had no separate safety system guarding it. A more practical example, and unfortunately a frequently occurring one, is asset management systems that manage both control and safety instruments. Attacking such a system allows influencing the safety trip point by modifying field transmitter settings. This is something commonly seen during security assessments where older types of logic solvers are used that don't support HART pull-through and interface directly with the I/O multiplexer. Logic solvers supporting HART pull-through enforce a maintenance mode for transmitter changes and so offer protection against this type of attack. These scenarios, and several others, are important during security assessments. Unfortunately, security assessments are often carried out by companies and consultants lacking the mix of cyber security and process control skills, which produces assessments not appropriate for critical infrastructure facing targeted attacks.


  • Wired HART and WirelessHART have been demonstrated to be cyber vulnerable. Field devices have minimal, if any, security. As the 2015 Ukrainian cyber hack demonstrated, serial-to-Ethernet converters can be a major cyber security problem. This is why the safety session we will be holding at the October ICS Cyber Security Conference is so important. Joe Weiss


  • Very good opinion, and I especially like the point about the security of supposedly remotely inaccessible components. I used to work at a Canadian facility where the control engineers originally could only access the process control systems from their field offices next to the board control room. Around the time the first personal PCs appeared, things started to change. Suddenly control engineers could access the process control systems from their personal office computers, which had their removable storage devices removed (at the time, that meant diskette drives). Today, with Management of Change (MoC), no changes – at least at leading facilities – are implemented on the process control system without an MoC review. At some places, that even includes the controller tuning we did on a day-to-day basis. I think Stuxnet should have shown the profession that you don't have to be directly connected to the Internet to be vulnerable to a cyber attack. As long as there is a path, and your system has vulnerabilities, you are vulnerable. With the recent drive by companies such as ExxonMobil to automatically configure field instruments remotely after installation, the vulnerability issue extends all the way to the system on which the configuration is created, and to whoever has access to the configuration file or configuration program. The bottom line is that the security of the process control system is no longer just a plant issue.
