Analog device vulnerability is a major threat to infrastructure, and the persistent culture gap could become an existential problem

Feb. 11, 2020
Unless the culture/governance gap between network monitoring and engineering is overcome, a sophisticated attacker can send us back to the 1850s.

Cybernetics was originally defined as a transdisciplinary approach for exploring regulatory systems—their structures, constraints, and possibilities. Norbert Wiener defined cybernetics in 1948 as "the scientific study of control and communication in the animal and the machine." In other words, it is the scientific study of how humans, animals and machines control and communicate with each other. It’s worth remembering that Wiener’s illustrative example was an engine governor, an analog device if there ever was one.

That definition reflects the engineering world’s reference point for describing interdependent communications long before the Internet was invented. The meaning of “cyber” has since shifted to an Internet- and IP-network-centric view, while overlooking the physical processes and systems that were integral to the original core definition.

All physical processes, like power generation, water purification, and natural gas distribution, are monitored and governed by control systems. It has been that way for more than 100 years. Control systems and physical processes can work without the Internet, but the Internet cannot work without power. Yet, when the topic of cyber security is raised, the audience and participants are largely unaware of this interdependent relationship. If society is to combat cyber risk to critical infrastructures, then the entirety of society must recognize that the consequences of the risk extend into the software, firmware, and hardware that can operate with and without IP connectivity. In other words, the reference point and considerations for cyber risk must include IT/OT systems connected to IP networks and control systems that do not need connectivity to IP networks to operate, even though connectivity to IP networks can provide great efficiency and productivity.

The fear held in the engineering community is that the cultural and governance gap between the IT/OT network and engineering organizations is not only persisting, but may even be widening. This gap could cause real events to be missed or misdiagnosed, and could be exploited by a knowledgeable attacker to make a cyber attack appear to be an equipment malfunction - that was Stuxnet. Bad actors do not care how they achieve their ends, and the asymmetric attacks on September 11, 2001 using commercial airplanes bear that out.

Understanding and collaboration are essential between engineering and IT/OT network experts

Process sensors are used in electric, water, oil/gas, chemicals, pipelines, manufacturing, transportation, medical devices, building controls, defense, etc. Compromising them can lead to long-term outages, equipment damage, environmental spills, and deaths. Process sensors are critical to the control and safety of all infrastructures because all actions start from what is measured. A process sensor is not just a device, but an entire ecosystem - and that ecosystem has not been cyber-secured. There is no one-size-fits-all process sensor solution, as many issues can occur with process sensors, including inaccuracies, counterfeit sensors, compromise of sensor networks, and electronic issues affecting parts such as amplifiers.

As Chris Sistrunk stated in response to the blog, https://www.controlglobal.com/blogs/unfettered/analog-sensors-can-be-hacked-and-ot-network-monitoring-cant-detect-it-a-hole-in-ics-cyber-security/, leveraging data from analog forensic sources is very important. It is not a new idea to correlate SCADA and plant Distributed Control System (DCS) event logs with physical security logs and network security logs. This (network alerts for engineers, and equipment alerts for network and physical security monitoring) should be part of an entire picture of monitoring and root cause analysis. As Chris noted in his LinkedIn post https://www.linkedin.com/feed/update/urn:li:article:7817985355608566230/, NERC reliability standards require this monitoring, as did the recommendations in the Final Report on the 2003 Northeast Outage. However, the NERC cyber security standards (CIPs) do not. I added DCS monitoring because the same type of disturbance/equipment monitoring occurs in power plants and other process facilities. However, coordinating the monitoring of analog data with OT network logging is not widespread practice. This goes back to the question of whether we are actually protecting the “crown jewels” - that is, keeping lights on, water flowing, etc.
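The correlation Chris describes can be sketched in a few lines of Python. Everything below is a hypothetical placeholder - the tag names, timestamps, and log record shapes are illustrative, not any vendor's actual SCADA/DCS or network-monitoring schema. The point of the sketch is the blind spot: a process anomaly with no matching network alert is exactly the case network monitoring alone cannot explain.

```python
from datetime import datetime, timedelta

# Hypothetical, simplified records; real SCADA/DCS and network logs differ.
process_events = [
    {"time": datetime(2020, 2, 11, 9, 15), "tag": "FT-101", "event": "reading deviation"},
    {"time": datetime(2020, 2, 11, 9, 40), "tag": "PT-202", "event": "sensor drift"},
]
network_events = [
    {"time": datetime(2020, 2, 11, 9, 14), "event": "unexpected Modbus write"},
]

def correlate(process_events, network_events, window=timedelta(minutes=5)):
    """Pair each process anomaly with network alerts inside a time window.

    A process anomaly with NO matching network alert is the gap discussed
    above: the physics changed but the network monitoring saw nothing.
    """
    results = []
    for p in process_events:
        matches = [n for n in network_events
                   if abs(n["time"] - p["time"]) <= window]
        results.append({"process": p, "network": matches,
                        "network_blind": not matches})
    return results

for r in correlate(process_events, network_events):
    label = "NO NETWORK EVIDENCE" if r["network_blind"] else "CORRELATED"
    print(f"{r['process']['tag']}: {label}")
```

In this toy run, the flow-transmitter deviation correlates with a network alert, while the pressure-transmitter drift has no network evidence at all - the case that demands engineering, not just network, root cause analysis.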

Commentary by networking experts needs to be tempered with this new understanding

Clint Bodungen’s response to my blog on the analog amplifiers illustrates the culture gap that needs to change if we want to keep lights on and water flowing. Clint states: “While this is an issue, this is in no way a cyber attack and I don't see how this has cyber consequences. OT threat monitoring wasn't designed for such "attacks", so you can't really point a finger at OT threat monitoring vendors saying, "you don't monitor for this!" Furthermore, it should be no surprise that an electric device is susceptible to EMI unless it is specifically shielded for it. I don't really see the breakthrough here. Not to mention that it is a close proximity "attack" so the likelihood or even the broader implications are limited. I could just as easily piss on the sensor to alter its reading. Maybe I'll write an alert on that! "Urine Attack Alters ICS Sensor Readings!"” The network experts have deep knowledge, and it is valuable, but it can also lead them to dismiss the very real problem of interference with sensors. Consider Wiener’s original definition. Call it sabotage rather than a cyber attack if you like, but a network fed by compromised sensors has problems too large to be dismissed.

Our engineering concerns run deep because the consequences can have lasting impact on millions of citizens without warning. As mentioned in my previous blog, devices such as analog amplifiers are commonly used by multiple vendors with no cyber security requirements. Defense-in-depth (safety) is based on the premise that there will be redundant measurements in case a critical sensor fails. However, if all of the redundant sensors use the same analog amplifiers, whether they are measuring the same or different values (for example, temperature or pressure), safety has been compromised by a common cause failure. Fault recordings and appropriate sensor monitoring can help detect this potentially devastating problem.
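The common-cause risk can be illustrated with a toy redundancy check. This is a sketch, not a production safety design: the readings, tolerance, and the idea of cross-checking against an independent estimate (a diverse sensor or a physics-based calculation) are illustrative assumptions. It shows why voting among redundant channels is not enough when every channel shares the same amplifier.

```python
import statistics

def vote(readings, tolerance):
    """Classic redundancy vote: accept the median if at least one pair of
    the three channels agrees within tolerance."""
    a, b, c = readings
    agree = sum(1 for x, y in ((a, b), (a, c), (b, c))
                if abs(x - y) <= tolerance)
    return statistics.median(readings) if agree >= 1 else None

def common_cause_suspect(readings, independent_estimate, tolerance):
    """Voting passes, yet the voted value disagrees with an independent
    reference: a shared amplifier may have skewed every channel alike,
    so the redundancy provided no protection."""
    voted = vote(readings, tolerance)
    return voted is not None and abs(voted - independent_estimate) > tolerance

# Three redundant pressure channels sharing one (hypothetical) amplifier,
# all offset the same way; a diverse measurement says the value is ~100.
readings = [112.1, 111.8, 112.3]
print(common_cause_suspect(readings, independent_estimate=100.0, tolerance=2.0))
```

The three channels agree beautifully with each other, so the vote passes - and all three are wrong by the same amount. Only the comparison against an independent source exposes the problem.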

Lastly, people have often asked about the origin of my database of actual control system cyber incidents. Many of the cases come from attendees at conferences who hear my actual case histories and realize they have experienced similar issues. That is what occurred with the analog amplifier blog. The following should help address some of the doubts about the viability of compromising analog amplifiers.

- “I encountered a similar issue while deploying a 802.15.4 wireless mesh system across a few Oil and Gas refineries. The 2.4GHz band tripped the Arc Flash detectors and scrammed the unit every time we broadcast on 2.4GHz.  Only occurred on a certain model of arc flash detector.  The issue was the ADC was not sufficiently shielded for the nominal increase from the 2.4GHz and it induced an offset voltage at the digital convertor that was enough to trip the unit. (If this can occur unintentionally with 2.4GHz cellular, what can happen with compromised 5G networks?) Now years later in the industrial cyber-PHA process these radios will be tagged as a high risk due to the capability through nefarious means to shut the process down simply by turning on the SSID broadcast on the 802.11 band. The site firewalls are aware of this issue and they would quickly diagnose the radio broadcast causing the shutdown and disable the radios, but that would be after the uncontrolled shut down and the systems using the wireless mesh would be offline until they could correct the issue.  I would be surprised if anybody in the root cause analysis would look to a purposely induced trip, they would assume it was a mistake first.”  Recall that Stuxnet was viewed as a “mistake” for over a year because the Iranians hadn’t considered that physical damage could be a result of a cyber attack. This is essentially Clint’s view 10 years later.

- "During the last week of October 2012, PG&E noticed that several field SCADA devices (including systems and components supplied by GE) operating from a transmitter location known as Round Top in the San Francisco Bay Area were experiencing an unusually high error rate. Malfunction of these SCADA systems hinders PG&E's ability to safely and reliably control its gas pipeline systems and electric power grid. On November 1, 2012 PG&E technical experts visited Round Top to diagnose the problems with the field devices and found interference with our SCADA master transmitter/receiver operating on licensed frequencies of 952.05625MHz (transmission) and 928.05625MHz (reception)."

- Radio Frequency Interference (RFI) from a digital camera that maintenance workers were using to photograph the control panel shut down the boiler feedwater pump controls, leading to a nuclear plant scram (shutdown).
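The mechanism in the first anecdote - an offset voltage induced ahead of the analog-to-digital converter - can be sketched numerically. The full-scale range, bit depth, and trip threshold below are hypothetical; the point is that an induced offset is indistinguishable, at the converter output, from a real change in the measured process.

```python
def adc_reading(true_signal_mv, induced_offset_mv, full_scale_mv=5000.0, bits=12):
    """Digitize a signal on a hypothetical 12-bit, 5 V full-scale converter.
    Any offset coupled into an unshielded front end (e.g., by RFI) adds
    directly to the signal before quantization."""
    code = round((true_signal_mv + induced_offset_mv) / full_scale_mv * (2**bits - 1))
    return max(0, min(code, 2**bits - 1))  # clamp to the converter's range

TRIP_CODE = 3500  # hypothetical trip threshold in ADC counts

normal = adc_reading(4200.0, 0.0)    # healthy signal, below the trip point
rfi = adc_reading(4200.0, 150.0)     # same signal plus an induced offset
print(normal, rfi, rfi >= TRIP_CODE)
```

The same physical signal, with a modest induced offset, crosses the trip threshold - and nothing in the digital record distinguishes the offset from a genuine process excursion, which is why root cause analysis defaults to "mistake."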

These are just some examples. As engineers read this blog, I am sure they will think of many other cases to add. In each case, the interference was assumed to be unintentional - and in Clint’s case, not associated with cyber. However, the only difference between malicious and unintentional may be intent, as the impact would be the same. With new, potentially compromised technology such as 5G coming down the path, the threat landscape is expanding in directions that too often continue to be ignored.

Unless the culture/governance gap between network monitoring and engineering is overcome, as Chris Sistrunk also mentioned, a sophisticated attacker can send us back to the 1850s.

Joe Weiss