Secure the SIS

Understand the strength of your last line of cyber defense

By William L. Mostia, Jr., P.E.

Cybersecurity is a complex issue, made more complicated by having four technical domains (IT, enterprise, process control and safety systems) with different purposes, goals and potential hazards and risks, and often with different personnel involved. Here are some of the considerations in performing the security assessment required in Clause 8.2.4 of the new Second Edition of IEC 61511-1: 2016 standard for safety instrumented systems (SIS). Some of the considerations also apply to non-SIS safety systems and to the broader industrial control and automation systems (ICAS).

The purpose of the SIS is to ensure that we make product safely. For the most part, it is not involved in how the process is controlled or how process and production information is normally collected or manipulated outside of the SIS; it acts only if the process exceeds safe limits. There are also secondary concerns that the safety system should not affect production through excessive spurious trips or high maintenance rates.

The broad cyber domains are conceptually illustrated in Figure 1, where it can be seen that the SIS is essentially embedded in the ICAS and is generally considered part of it. It is important to understand the role the SIS plays in the ICAS to understand how a cyber attack on a SIS might occur, and what cyber consequences could be physically realized and lead to a hazardous condition.

Introduction to IEC 61511 first edition

When the First Edition of IEC 61511-1 came out some 14 years ago (and ANSI/ISA S84: 1996 before it), cybersecurity concerns and cyber threats were not very well recognized in the process industries. The 61511-1 2003 1st Edition standard did recognize potential security threats to SIS. These threats primarily revolved around inadvertent or unauthorized access and changes that affected the safety integrity of the SIS. Unauthorized or mismanaged changes have long been recognized as a safety hazard (e.g. Flixborough, 1974), which led to the current Management of Change (MOC) regulations.

The security concerns at that time (and they remain concerns today) were primarily unauthorized physical access (e.g. locked cabinets, building access control, etc.), programming access (e.g. keylocks, programming panels and computers, dongles, etc.), and password controls. Most security threats were considered internal, e.g. inadvertent, unauthorized and/or undocumented changes, problems caused by disgruntled employees, etc.

Recent rapid advancement of digital control system technology, advances in computing power and interconnection of the world via the Internet, intranets, wireless and now, the Internet of Things (IoT) and Industrial IoT (IIoT) have opened a Pandora’s box of opportunities, but also unleashed the big, bad wolf in the form of increasing cybersecurity threats.

A key difference between today's security threats and those addressed in the 61511-1 1st Edition is that we have moved from a small arena (your company) of mostly internal, physical threats to a much larger threat arena (the world) that can include both internal physical threats and external, largely invisible cybersecurity threats coming from anywhere in the world.

Second edition security assessment requirements

IEC 61511-1 2nd Edition recognized the increasing threat of cyber attacks to SIS, and added requirements to help reduce the risk. Compliance with the new IEC 61511-1 Second Edition 2016 will require that the SIS have a security assessment (Clause 8.2.4) and that the design of the SIS provide the necessary resilience against the identified security risks (Clause 11.2.12). Clause 8.2.4 gives a general outline of what is to be accomplished in a security assessment, but not much on the nitty-gritty of performing it. The basic methodology is one of reductionism: breaking the SIS domain down into smaller pieces of equipment, analyzing each piece's vulnerabilities, evaluating the existing protections that limit exposure of those vulnerabilities to a physical security breach or cyber attack, providing additional protections to reduce the risk to an acceptable level based on the corporate risk criteria, and documenting the risk assessment. When using a reductionist methodology, care must be taken not to miss system-level threats.
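
As a rough illustration (not prescribed by the standard), the per-device bookkeeping described above could be captured in a simple record like the Python sketch below; the field names and risk categories are hypothetical.

from dataclasses import dataclass, field

@dataclass
class DeviceSecurityAssessment:
    # Hypothetical per-device record for a Clause 8.2.4-style assessment.
    device: str
    vulnerabilities: list = field(default_factory=list)        # known weaknesses
    existing_protections: list = field(default_factory=list)   # controls already in place
    additional_protections: list = field(default_factory=list) # controls to be added
    residual_risk: str = "TBD"  # e.g. "low"/"medium"/"high" per corporate criteria

    def meets_criteria(self, acceptable=("low",)):
        # Compare the documented residual risk against corporate risk criteria.
        return self.residual_risk in acceptable

# Example: a logic solver with an engineering port left enabled.
solver = DeviceSecurityAssessment(
    device="SIS logic solver",
    vulnerabilities=["engineering port left enabled"],
    existing_protections=["locked cabinet", "keylock in RUN"],
    additional_protections=["disable port when not in use", "alarm on keylock change"],
    residual_risk="low",
)
print(solver.device, "meets criteria:", solver.meets_criteria())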

The overall risk assessment process is similar to a process Hazards and Risk Assessment (H&RA). In some cases, a cyber attack on the ICAS could initiate a cause similar in effect to an equipment failure or human error in a process H&RA, where other layers of protection (assuming they have not been defeated in the course of the cyber attack) will come into play to bring the process to a safe state. The protection layers credited in a standard layer of protection analysis (LOPA) would then prevent the physical realization of this cyber threat as a hazard. In this case, the ability of the SIS to resist defeat of its safety function can be paramount to achieving a safe state of the plant.
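
For orientation, conventional LOPA arithmetic multiplies the initiating event frequency by the probability of failure on demand (PFD) of each credited IPL. The sketch below uses purely illustrative numbers to show how defeating IPLs (for example, through a cyber attack) raises the mitigated event frequency.

def mitigated_frequency(initiating_freq_per_yr, ipl_pfds):
    # Classic LOPA arithmetic: f_mitigated = f_initiating * PFD1 * PFD2 * ...
    f = initiating_freq_per_yr
    for pfd in ipl_pfds:
        f *= pfd
    return f

# Illustrative numbers only: BPCS failure once per 10 years; alarm/operator
# response PFD 0.1; SIL 2 SIF PFD 0.01; mechanical relief device PFD 0.01.
ipls = {"alarm": 0.1, "SIF": 0.01, "relief": 0.01}

print("All IPLs intact:        %.1e per year" % mitigated_frequency(0.1, ipls.values()))
# If a cyber attack defeats both the alarm (in the BPCS) and the SIF,
# only the mechanical relief device is left as a credited IPL.
print("Alarm and SIF defeated: %.1e per year" % mitigated_frequency(0.1, [ipls["relief"]]))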

SIS cyber domain

Philosophically, the SIS is best understood in the context of layers of protection, where the SIS is equivalent to one to three independent protection layers (IPLs) that protect against hazards identified by a process risk analysis. Figure 2 illustrates a cyber attack scenario in which an attack on the basic process control system (BPCS) could initiate a cause equivalent to a normal equipment failure or human error. The same attack could also defeat the alarm IPL, if it resides in the BPCS and the attacker has sufficient skill in manipulating the BPCS.

If the SIS is designed to protect against that particular initiating cause, it should serve its purpose as an IPL and bring the process to a safe state. But if the attack is also directed at the SIS, defeat of the safety instrumented function (SIF) protecting against the BPCS initiating cause could also defeat the instrumented IPLs for that hazard, leaving protection to the mechanical IPLs.

If the initial cyber attack was directed strictly at the SIS and not the BPCS, the SIS safety function could be defeated, leaving a latent dangerous failure in the SIS. A cyber attack on the SIS might also initiate a spurious trip (one or many), causing a safety incident or disrupting production. If software resets are used in the SIS (versus field resets), the cyber attack might auto-reset the SIF and allow it to trip again later, leading to a hazardous condition.

These SIS cyber attacks can be overt, e.g. in conjunction with a simultaneous cyber attack on the BPCS, or covert, defeating the safety function of the SIS and waiting for a normal safety demand or a later cyber attack on the BPCS. It is important to realize that the SIS is typically not the only IPL protecting against a hazard, which should be considered in any risk assessment, and that the primary risk we are concerned with is the physical realization of a hazard.

It is also important to understand how a SIS can be defeated in order to provide protection against those events. For example, some of the ways a SIS/SIF can be defeated are placing the safety functions in bypass without the operator being aware, placing the logic solver in an infinite loop, changing the trip and alarm setpoints, disconnecting the output from the logic, spoofing the inputs, forcing the outputs, etc. This understanding can lead to SIS designs that, in addition to the Zones/Conduits protection scheme, can help to protect against specific failure modes resulting from a cyber attack.
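
A minimal monitoring sketch, assuming the approved SIF configuration (setpoints, bypass and force states) can be read back from the system for comparison; the tag names and values are illustrative, not vendor-specific.

# Approved (baseline) SIF configuration versus a live read-back snapshot.
APPROVED = {
    "trip_setpoint_psig": 250.0,
    "alarm_setpoint_psig": 230.0,
    "bypass_active": False,
    "output_forced": False,
}

def defeat_indicators(live):
    # Return differences that may indicate the SIF has been defeated or altered.
    findings = []
    for key, expected in APPROVED.items():
        actual = live.get(key)
        if actual != expected:
            findings.append("%s: expected %s, found %s" % (key, expected, actual))
    return findings

# Example: an attacker raised the trip setpoint and put the function in bypass.
live_snapshot = dict(APPROVED, trip_setpoint_psig=400.0, bypass_active=True)
for finding in defeat_indicators(live_snapshot):
    print("ALERT:", finding)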

IEC 61511-1 security assessment steps

To perform a security assessment under the new IEC 61511-1 standard, one of the first things to do is establish the outer boundary of the SIS under assessment. Where the zones-and-conduits protection concept of ISA/IEC 62443:2010 and ISA TR84.00.09 is used (Figures 3 and 4), the zone boundaries of the SIS are the boundaries for the security assessment. This is somewhat similar in concept to nodes in a HAZOP. An alternative to this top-down, HAZOP-like approach would be a bottom-up failure mode and effects analysis (FMEA), looking at how the system could fail to accomplish its purpose given a security event. Or use a combination of both methods to cover all your bases.
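
As a rough sketch of scoping the assessment boundary, a zone can be described by the devices inside it and the conduits that touch it; any conduit with an endpoint outside the zone is a boundary crossing to be examined. The device and conduit names below are hypothetical.

# Hypothetical SIS zone description used to scope the assessment.
sis_zone = {
    "devices": {"logic solver", "PT-101", "XV-201", "SIS engineering station"},
    "conduits": [
        # (from, to, description) -- anything crossing the zone edge is in scope
        ("logic solver", "BPCS gateway", "read-only process data link"),
        ("SIS engineering station", "logic solver", "programming link"),
        ("maintenance laptop", "PT-101", "HART handheld / AMS connection"),
    ],
}

def boundary_crossings(zone):
    # List conduits where at least one end lies outside the zone's device set.
    inside = zone["devices"]
    return [c for c in zone["conduits"] if c[0] not in inside or c[1] not in inside]

for conduit in boundary_crossings(sis_zone):
    print("Assess boundary crossing:", conduit)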

The security assessment requires a description of all the covered devices. These device descriptions should include a listing of all the hardware and software versions, including device sub-modules, for change management. There is a famous old saying that the author made up this morning, and that is, “If you don’t know what you have, you cannot protect what you have.”
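
A sketch of the kind of inventory record this implies, assuming hardware and software versions are captured per device and sub-module so that later changes can be flagged; the structure and field names are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class DeviceRecord:
    # Illustrative inventory entry for SIS change management.
    tag: str
    model: str
    hw_version: str
    sw_version: str
    submodules: dict = field(default_factory=dict)  # sub-module name -> version

baseline = {
    "LS-01": DeviceRecord("LS-01", "safety logic solver", "rev C", "4.2.1",
                          {"comms card": "1.3.0", "AI card": "2.0.4"}),
    "PT-101": DeviceRecord("PT-101", "pressure transmitter", "rev B", "3.1.0"),
}

def changed_devices(baseline, current):
    # Flag any device whose recorded versions differ from the approved baseline.
    return [tag for tag, rec in current.items()
            if tag not in baseline or rec != baseline[tag]]

# Example: a transmitter firmware update that should have gone through MOC.
current = dict(baseline)
current["PT-101"] = DeviceRecord("PT-101", "pressure transmitter", "rev B", "3.2.0")
print("Changed devices:", changed_devices(baseline, current))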

Security-critical information (e.g. alarm and trip setpoints, field device parameters, communication parameters, etc.) within the devices and system should be identified to allow detection of unauthorized changes in the parameters critical to safety. The description should include all connections to other devices within the SIS, to devices outside the SIS boundary, and to all devices used for non-operational purposes (programming terminals, update connections, field device communicators, calibration equipment, asset management systems (AMS), any connection to the outside world whether active or not, etc.). This should include hardwired connections that can be influenced by a cyber attack and any connections that help provide protection against cyber attacks (e.g. a hardwired, keyed bypass enable switch).
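
One simple way to make unauthorized changes to such security-critical parameters detectable is to keep a hash of an approved snapshot and periodically recompute it from live values, as in this sketch; the parameter names are examples only, not a required list.

import hashlib
import json

def snapshot_digest(params):
    # Hash a snapshot of security-critical parameters in a stable (sorted) order.
    canonical = json.dumps(params, sort_keys=True).encode()
    return hashlib.sha256(canonical).hexdigest()

# Approved snapshot recorded at commissioning or after an authorized MOC.
approved = {
    "PT-101/trip_setpoint": 250.0,
    "PT-101/damping_s": 0.5,
    "LS-01/remote_write_enable": False,
}
approved_digest = snapshot_digest(approved)

# Periodic check against values read back from the running system.
live = dict(approved, **{"LS-01/remote_write_enable": True})  # simulated tampering
if snapshot_digest(live) != approved_digest:
    print("Unauthorized change detected in security-critical parameters")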

Known cyber vulnerabilities for each device should be listed. As part of generating this list, known cyber vulnerabilities should be discussed with the device manufacturer and researched within the industry. Physical security vulnerabilities for each device and for the system as a whole should also be listed. There is commercial software becoming available to make this inventory effort easier and to help automate monitoring of unauthorized changes and potential security issues. An example of this type of software is Cyber Integrity by PAS.

Develop a description of identified security threats that could exploit the listed equipment vulnerabilities and result in security events (including intentional attacks on the hardware, application programs and related operating system software, as well as unintended events resulting from human error). It is important in this type of risk assessment to understand the security threat vectors: how the threat gets into the system (access points), how each access point is reached, the nature of the attack, any enabling conditions that facilitate the threat vector, the devices or path the threat vector takes to propagate through the system to a point where it can be physically realized as a hazardous or undesirable condition, and what must happen for the result to be a safety incident.

The methods to prevent the threat vector from reaching the device's vulnerability, the methods to mitigate the cyber attack and, in the worst case, the means to recover from the attack should all be listed. It is also important to note how such cyber intrusions/threats could be detected in the system, even if the system successfully repels the attack, as these may be probes of your system.
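
Putting the last two steps together, a threat-vector worksheet row might record the access point, path and enabling conditions alongside the prevention, mitigation/recovery and detection measures; this sketch and its example entries are hypothetical, not taken from the standard.

from dataclasses import dataclass, field

@dataclass
class ThreatVectorRecord:
    # Illustrative worksheet row for a threat vector and its countermeasures.
    access_point: str                 # how the threat gets into the system
    attack_nature: str
    enabling_conditions: list = field(default_factory=list)
    propagation_path: list = field(default_factory=list)  # devices the vector traverses
    physical_consequence: str = ""
    prevention: list = field(default_factory=list)
    mitigation_and_recovery: list = field(default_factory=list)
    detection: list = field(default_factory=list)          # how a probe or attack is noticed

# Hypothetical example: malware introduced via a portable engineering computer.
laptop_vector = ThreatVectorRecord(
    access_point="portable engineering computer",
    attack_nature="malware altering the SIF application program",
    enabling_conditions=["computer also connects to the business network"],
    propagation_path=["portable computer", "SIS engineering station", "logic solver"],
    physical_consequence="SIF fails on demand (latent dangerous failure)",
    prevention=["dedicated, locked-down engineering computer", "removable-media controls"],
    mitigation_and_recovery=["restore application program from a verified backup"],
    detection=["application program checksum monitoring", "review of download/change logs"],
)
print(laptop_vector.access_point, "->", laptop_vector.physical_consequence)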

Comments

  • Great article William, one of the best I've seen on this topic, especially the way you tie it into the IEC standards. My only comment or addition would be that there should be no way to attack via cyber the SIS, or any SIS component, except for physical access. There is a tendency to layer security controls, the vaunted defense-in-depth, and say attack paths are adequately secured / likelihood is sufficiently low. In a SIS this should not be accepted. Dale Peterson @digitalbond
