Implications of the Triconex safety system hack – Stuxnet part 2?

I am a control systems engineer, not a threat analyst. Consequently, I am not trying to answer the question of who would want to attack Triconex PLCs, or why. The motivations could vary: a specific target facility may use Triconex PLCs, or an attacker may want to understand the "target" landscape for attacking other facilities, such as nuclear power plants, that use Triconex PLCs. There are also a number of possible threat actors, which threat analysts are better able to address.

As stated, on August 4th a facility in the Middle East had its Triconex safety instrumented system (SIS) cyber attacked. Various articles have identified the Triconex hack as a "watershed" moment. Ironically, the 2009-10 Stuxnet attack was a "successful" hack in that it served as a long-term reconnaissance tool and destroyed equipment. The Triconex hack was actually a failure in that it neither damaged equipment nor became a long-term reconnaissance tool.

Safety systems, either nuclear plant safety or SIS, are used in dangerous conditions to assure process safety. These applications include nuclear plants (Triconex was approved by the US NRC for nuclear plant safety applications in 2012), chemical plants, refineries, water systems, off-shore oil platforms, etc. 

The Triconex safety systems and Stuxnet cyber attacks bear interesting similarities:

- Both were nation-state hacks of control system networks through operators' Windows-based workstations that downloaded altered control system logic (not traditional malware)

- Both affected safety systems that were connected to non-safety systems

- Both used hacking methodologies that can be applied to other ICS vendors

There are a number of lessons to be drawn from the Triconex case:

- The Triconex hack was similar to Stuxnet seven years earlier, yet many people were surprised.

- Triple redundancy does not equal cyber security.

- Mixing control and safety is unsafe because you effectively lose safety (I agree with Mandiant's Dan Scali's assessment). There have been incidents where, had control and safety been connected, catastrophic damage would have occurred. An example was a large power plant that lost ALL control system logic in EVERY DCS processor following a network action. The reason there was not catastrophic damage was that the plant still used hard-wired analog safety systems.

- Nuclear safety does not allow mixing of control and safety, whereas SIS standards continue to allow mixing basic process control and SIS (e.g., ISA84/IEC 61511 and Namur 163). I have written numerous blogs about the problems with mixing control and safety.

- There are control system suppliers that provide integrated control and safety systems with no guidance to end-users about the mixing of control and safety. There should be NO sharing of sensors, actuators, and/or HMIs between safety and non-safety systems, or you have effectively lost safety.

- ICS components used in safety systems continue to have DHS ICS-CERT cyber vulnerability notifications. How can you assure safety if safety systems continue to use components with known cyber vulnerabilities and are not fully isolated from non-safety systems, including business networks?

- Windows workstations and HMIs are still one of the primary cyber attack vectors. Consequently, there is a need for alternative methods of detecting control and safety system performance abnormalities. This monitoring should come from the raw process sensor signals BEFORE they are converted into Ethernet packets, or it may not be possible to identify the cause of the abnormalities. As an example, Stuxnet could have been detected early, as the "raw" process sensor data would not have matched the compromised Windows HMI displays.
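To illustrate the monitoring idea above, here is a minimal sketch (not any vendor's product or API; all names, values, and the tolerance are hypothetical) of comparing out-of-band raw sensor readings against the values the potentially compromised HMI reports, and flagging divergence:

```python
def flag_divergence(raw_readings, hmi_readings, tolerance=0.02):
    """Return sample indices where the HMI value diverges from the raw
    sensor value by more than `tolerance` (as a fraction of the raw value).
    Illustrative only; names and threshold are hypothetical."""
    anomalies = []
    for i, (raw, hmi) in enumerate(zip(raw_readings, hmi_readings)):
        # Guard against divide-by-zero on a dead (0.0) raw signal
        ref = abs(raw) if raw != 0 else 1e-9
        if abs(raw - hmi) / ref > tolerance:
            anomalies.append(i)
    return anomalies

# Stuxnet-style scenario: the HMI replays a steady "normal" value while
# the actual process value (taken before Ethernet conversion) drifts.
raw = [1064.0, 1064.2, 1090.5, 1210.0, 1382.7]  # raw process signal drifting
hmi = [1064.0, 1064.1, 1064.0, 1064.2, 1064.1]  # HMI shows "normal"
print(flag_divergence(raw, hmi))  # -> [2, 3, 4]
```

The key design point is that the reference readings come from an independent path (the sensor side), so a compromise of the Windows workstation alone cannot mask the discrepancy.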

On December 14, 2017, ISA99 Working Group 4, Task Group 7 held the kick-off conference call for a new effort to address the cyber security of Level 0,1 devices. Safety issues, including the mixing of control and safety systems, are expected to be addressed as part of this working group.

Joe Weiss