Forgive me for indulging feverish thoughts during this winter’s shortest and darkest days in my windowless office, lit only by ghastly fluorescents and the dim pixels of a laptop LCD. You see, the phone keeps ringing and displaying “Joe Weiss,” and when I answer, Joe explains again that the problem with industrial cybersecurity isn’t only where everyone is looking—at the Internet, the networks, the gateways, the traffic, the bad and accidental actors who would get on the system, steal intellectual property, shut down the control or safety functions and paralyze or explode the facility.
Joe blogs as “Unfettered” on ControlGlobal.com, and says we must also address the integrity of the sensors themselves. No matter how sure we can be that the system is secure—that no one can or will effectively breach it and interfere with safe, productive operations—we are not immune if an ignorant, mistaken or bad actor—or bad sensor—can provide a signal that looks trustworthy but isn’t. If a field device can look normal, and use normal network traffic to lie to the system, none of the usual cybersecurity precautions will detect it, prevent it or alert us to the event.
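One way to catch a lying sensor is to check it against physics rather than against the network. The minimal sketch below (tag names, values and the tolerance are all hypothetical, and a real plant would vote far more carefully) compares redundant transmitters on the same process variable against their median, flagging any reading that strays too far. A spoofed signal can pass every network-level check and still fail this physical one.

```python
# Hypothetical plausibility check for redundant transmitters.
# Tag names (FT-101A/B/C) and the tolerance are illustrative only.
from statistics import median

TOLERANCE = 2.0  # engineering units; a plant-specific assumption


def flag_suspect(readings):
    """Return the tags whose readings deviate from the median vote."""
    mid = median(readings.values())
    return [tag for tag, value in readings.items()
            if abs(value - mid) > TOLERANCE]


# Example: FT-101C reports a value far from its redundant partners,
# even though its network traffic looks perfectly normal.
readings = {"FT-101A": 50.2, "FT-101B": 50.5, "FT-101C": 38.0}
print(flag_suspect(readings))  # -> ['FT-101C']
```

The point of the sketch is Joe’s point: the check lives at the level of the measurement itself, not the packets that carry it.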
It reminds me of concerns raised before our last U.S. national election about the security of the voting process. It’s been demonstrated that relying on the Internet or electronic voting machines leaves open the possibility of interference by cyber attack, malfeasance by agents in the political system or, at least, malfunction and loss of voting integrity. The responsibility is widely distributed to local governments, and at least where I live, they are taking it seriously. Now and for the foreseeable future, vote counts will be supported by paper ballots and paper voter logs and registrations. Hoosiers don’t mess around with this stuff.
But like the outputs of a process control system, the votes themselves are the product of the information—the data—taken in by the voters. Some of you may have noticed I’m old and set in my ways (editing a print magazine, for example), and I get almost all my political information the old-fashioned way, from network TV, established major newspapers and magazines—the mainstream media. Others may subscribe to digital news feeds and e-newsletters, Twitter, Facebook, etc., but I feel no need.
In a way, my sources are kind of like a traditional, closed DCS. My sensors might be misleading me, but they’re brand-name, mostly reliable and fail in ways with which I’ve become familiar. If something seems off, I know how to cross-check them and when to ignore them. On the other hand, they may not be capable of telling me everything I need or ought to know.
Compared to my DCS sensors, those digital stories from diverse sources are more like information from the cloud, the Internet and my e-mail in-box. Most are trustworthy, but many are not and some are just plain malicious. The risk of getting false information or, worse, information designed to interfere with my decision-making is real and, I think, more subtle than with traditional media.
Whatever your source of information, you are more or less subject to interference in the feed. As with a DCS, if you get false, biased or bad information, your decision may be wrong. When that happens in a plant, you’ll recognize the symptoms when the process shuts down or someone is hurt or killed. But will you understand the cause?
In his recent keynote at the Texas A&M Instrumentation and Automation Symposium, Joe said, “The IT idea of prevention may not be adequate for the industrial control system (ICS) environment, and consequently, resilience and recovery become very important. An interesting adjunct is the concept of a cyber Pearl Harbor. Will there be one? Possibly. However, because of the lack of ICS cyber forensics and adequate training, we may not know it is cyber-related.”
Good advice for process control, and for the next election.