Results from the ThreatConnect webinar on mitigating risks in critical infrastructures and ongoing actual risks

July 26, 2021
On July 21, 2021, I participated in a live webinar discussion, hosted by ThreatConnect’s Dan Verton, on mitigating risk in critical infrastructures with Bob Kolasky from DHS CISA and Tim Grieveson from Aveva. The webinar link can be found at https://threatconnect.com/podcast/threatconnect-podcast-ep-21-mitigating-cyber-risk-in-critical-infrastructures/. I met Dan in 2001 when I was at EPRI and he was with ComputerWorld. Unfortunately, many of the unresolved control system cyber issues from 2001 still exist in 2021. My focus has always been on the reliability and safety of the critical infrastructures, not the loss or compromise of data. That is, keeping lights on, water flowing, etc., which is not the same as keeping networks available. Tim Grieveson and I were in agreement on the need to address the culture gap between engineering and network organizations. Unfortunately, the CISA discussion didn’t address some of what is necessary to secure control systems, including actual control system cyber incidents and the lack of security in process sensors.

There is a lot of discussion, in Washington and elsewhere, about critical infrastructure protection, and that’s all to the good. One of the gaps in our understanding of this challenge, however, remains the gap between the engineering community and the networking community. The ultimate goal for both organizations is to keep the physical process running (e.g., lights on, water flowing, etc.). However, a network-only focus won’t achieve that without ground truth about the physics, which is the engineering focus.


I am a nuclear engineer and spent many years working on nuclear safety. Mitigating risk – at any cost – is something that is familiar to me.

I met Dan in 2001 when I worked at the Electric Power Research Institute (EPRI). At the time, he was working at ComputerWorld magazine. My focus has always been on the reliability and safety of critical infrastructure, not the loss or compromise of data. That is, keeping lights on, water flowing, etc., which is not the same as keeping networks available. I have been frustrated by the lack of participation in control system cyber security by the engineering community, as well as the lack of any outreach programs by the networking community to the engineering community. As Tim and I pointed out, the culture gap between engineering and network organizations is a major stumbling block to securing control systems. Yet the recent Executive Order on cybersecurity (EO 14028) has exacerbated the gap between engineering and network organizations. Unfortunately, many of the issues I raised in 2001 still exist in 2021.

In 2015, DHS stated in its monthly ICS Monitor blog that control systems should never be connected to the Internet. In the Summer 2021 issue of Electric Energy, Dr. Edmund Schweitzer, the father of the modern electronic protective relay, stated the following: “Computers, their networks and things connected to networks are juicy targets for bad actors. Power systems can run without computers and networks. (I refer to those days as BC…Before Computers.) Computers and networks are tools that can enhance performance without sacrificing reliability. However, we must not let these tools become additional inputs to the “AND gate” to us getting power to the wall plug. I believe we should never connect critical infrastructure to the Internet (and this includes cloud computing) and we should audit this (bold italics from Dr. Schweitzer). If it is critical communications, then use your own network and secure it. Guard it all with the same vigor that you build a substation and lock the gate… Not all equipment needs an Ethernet port...”
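A back-of-the-envelope way to see Dr. Schweitzer’s “AND gate” point: if getting power to the wall plug requires every one of several subsystems to be working at the same time, the overall availability is the product of the individual availabilities, so every computer, network, or cloud service added as a hard dependency can only lower it. The sketch below is purely illustrative; the availability figures are assumptions for the example, not numbers from the webinar or any utility.

```python
# Illustrative only: the availability numbers below are assumed for the
# example, not taken from the webinar or any utility data.

def series_availability(availabilities):
    """Availability of a system that fails if ANY required input fails
    (an 'AND gate' of dependencies): the product of the inputs."""
    result = 1.0
    for a in availabilities:
        result *= a
    return result

# Hypothetical physical chain: generation, transmission, distribution.
physical_chain = [0.9999, 0.9995, 0.999]

# The same chain with a network/cloud service added as another hard dependency.
with_network_dependency = physical_chain + [0.995]

print(f"Physical chain only:     {series_availability(physical_chain):.5f}")
print(f"Plus network dependency: {series_availability(with_network_dependency):.5f}")
```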

Almost 12 million control system cyber incidents are on record. Collectively, these incidents have killed more than 1,500 people, and have caused more than $90 billion (USD) in direct damage. However, as there are no control system cyber forensics below the Ethernet level, most of these cases were not identified as being cyber-related. This is why there is such a critical need for training control/process system engineers to recognize physical incidents (e.g., a valve closing, a relay opening, etc.) as possibly being cyber-related.

There is no cybersecurity, authentication, or cyber logging in process measurements (e.g., pressure, level, flow, temperature, voltage, current, etc.). You can’t be cyber secure, resilient, or safe if you can’t trust your measurements. Moreover, how can you identify cyber incidents and meet the TSA pipeline security requirements when the NTSB did not identify cyber-related pipeline ruptures as being cyber-related, because they involved control system issues, not just network issues?
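Because the sensors themselves provide no authentication or logging, one compensating engineering practice is to sanity-check each measurement against its physically plausible range and against redundant instruments rather than trusting any single value. The following is a minimal sketch of such a plausibility check; the tag name, readings, and thresholds are hypothetical and are not drawn from any standard, vendor, or the webinar.

```python
# Minimal, hypothetical plausibility check for an unauthenticated process
# measurement. Tag names, readings, and thresholds are assumed for illustration.

def check_measurement(tag, value, low, high, redundant_values, max_spread):
    """Flag a reading that is outside its physically plausible range or that
    disagrees with redundant sensors by more than an allowed spread."""
    issues = []
    if not (low <= value <= high):
        issues.append(f"{tag}: {value} outside plausible range [{low}, {high}]")
    if redundant_values:
        spread = max(abs(value - r) for r in redundant_values)
        if spread > max_spread:
            issues.append(f"{tag}: disagrees with redundant sensors by {spread}")
    return issues

# Hypothetical level transmitter with two redundant channels.
problems = check_measurement(
    tag="LT-101", value=92.0, low=0.0, high=100.0,
    redundant_values=[61.5, 60.8], max_spread=5.0,
)
for p in problems:
    print("INVESTIGATE (possible sensor, process, or cyber issue):", p)
```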

Bob mentioned that the Department of Energy (DOE) is leading a task force called the Securing Energy Infrastructure Executive Task Force. As Bob said, one of the task force teams is working on evaluating technology and standards to secure ICS, a second is working on new classes of security vulnerabilities in ICS, and a third is working on developing a national cyber-informed engineering strategy. Particularly because of my concern with the Aurora vulnerability, I reviewed some of the new technology, such as the Constrained Cyber Communication Device (C3D). From what I could tell from the publicly available information, the new technology did not appear to address some of the actual malicious and unintentional control system cyber incidents that have occurred. Additionally, the process sensor cybersecurity gap has not been addressed by any of the government organizations, including CISA, DOE, or DOD (on the unclassified side). Conversely, there has been minimal government participation in the industry organizations working on control system cyber security and safety. Control systems can be impacted by older classes of cyber vulnerabilities as well as Advanced Persistent Threats (APTs). These older classes of security vulnerabilities are still going unaddressed even though they have caused significant incidents. Many cyber security experts ignore these older threats as they are not “interesting enough,” despite the significant impacts they have caused (https://www.synack.com/were-in-synack-podcast/?utm_source=organic_social).

Bob stated that part of being a good risk manager is having the best available information about risk, in this case about threats, vulnerabilities, and consequences. He wants to get network defenders the best information about the current state of cyber threats. Unfortunately, that hasn’t always been the case with the government, in this case DOE. The Chinese did a “Stuxnet” on the US power grid by installing hardware backdoors in at least one large power transformer. A second Chinese transformer was sent to Sandia National Laboratories (SNL) to be “dissected”. Yet there has been no information sharing by DOE on what was found after dissecting the Chinese transformer. How can industry protect itself without this information sharing?

Bob is focused on networks, not the physical process (e.g., keeping networks up, not lights on). Dr. Schweitzer thought this kind of focus would eventually mislead thinking about the physics issues. There is a class of physics (or physical-cyber) issues where no malware is involved; remote access (cyber) is simply used to push a physical process into a “forbidden operating zone” (physical), where “Mother Nature” causes the damage, often in a very short time frame (e.g., milliseconds). The Aurora vulnerability is one example of a physical-cyber incident, where the protective relays are manipulated to cause physical damage to any Alternating Current (AC) equipment or transformers connected to the relays. As such, the Aurora vulnerability can bring the grid down for 9-18 months, yet it cannot be found by network monitoring (see the above discussion of C3D and the Synack podcast that disregards the impact of this problem).
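To make the physics concrete: when a generator is separated from the grid and slips even slightly in frequency, its voltage angle drifts away from the system’s, and reclosing the breaker with a large angle difference produces the torques and currents that damage rotating AC equipment and transformers. A synchronism-check function would normally block such a close, which is why Aurora works by defeating or misusing the protection itself. The sketch below is a simplified illustration of that angle drift and check; the slip frequency, open time, and 20-degree window are assumed values for the example, not settings from any actual relay.

```python
# Simplified, hypothetical illustration of why out-of-phase reclosing is
# damaging. Slip frequency, open time, and angle limit are assumed values.

def angle_drift_degrees(slip_hz, open_time_s):
    """Voltage angle separation accumulated while the breaker is open,
    given a frequency slip between the islanded machine and the grid."""
    return (slip_hz * open_time_s * 360.0) % 360.0

def sync_check_permits_close(angle_deg, max_angle_deg=20.0):
    """A basic synchronism-check style test: only permit reclosing when
    the angle difference is within the allowed window."""
    # Treat angles near 360 as near 0 (e.g., 355 degrees is about -5 degrees).
    diff = min(angle_deg, 360.0 - angle_deg)
    return diff <= max_angle_deg

# Hypothetical case: 0.5 Hz slip, breaker open for only 0.25 seconds.
angle = angle_drift_degrees(slip_hz=0.5, open_time_s=0.25)
print(f"Angle separation at reclose attempt: {angle:.0f} degrees")
print("Sync check would permit close:", sync_check_permits_close(angle))
```

Even a quarter-second open interval with a modest slip puts the machine well outside a typical permissive window, which is why Aurora-style rapid open/close cycling can do its damage in well under a second and leaves little for network monitoring to see.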

Bob mentioned that in April 2021, CISA hosted a discussion with cyber risk metric experts who help boards think through cyber risk. Bob stated that CISA is beginning to come up with some core concepts about national security cyber risk in terms of metrics. This is great, but where were the engineers, and why aren’t actual control system cyber incidents that cause physical damage and injuries being addressed?

Bob’s discussion about insurance concerned cyber insurance but did not mention business continuity or liability. Yet the reason control system cybersecurity is so important is that control system cyber incidents have killed people and destroyed critical long-lead equipment. According to Bob, CISA is hoping that the insurance companies will create incentives to insure only those companies that are following good practices, reporting incidents to the government, and have good playbooks in place, or at least price that behavior into their premiums so good behavior is incentivized. Yet this can’t be done without cyber forensics at the control system controller layer and control system cyber security training for the engineers.

The Russians, Chinese, Iranians, and others are aware of these control system cybersecurity gaps. Why are these control system cyber security gaps not being adequately addressed?

Joe Weiss