Rockwell's Doug Wylie says, "The sophistication level of Stuxnet is one that customers hadn't anticipated in the past. They're looking to vendors for leadership here. It has provided affirmation to vendor companies putting together control system solutions that investments in incident response and designing in systems-level security are worthwhile. It helps them justify their investments in these things."
The Buck Stops Here
That brings us to the hard truth that applies to all control system users: Good cyber security begins at home.
What should you be doing in response to Stuxnet? The answer is both simple and not-so-simple. Look to your own security.
Begin by asking how secure you are now. Then talk to your system vendor(s). They've built your systems, after all. Who better understands the best way to secure them?
Look to standards—ISA99, IEC SC65C WG13, NIST and others—for help. If you're in a "critical infrastructure" industry, there are government guidelines. The guidelines for the power industry from the North American Electric Reliability Corp. (NERC), the Dept. of Homeland Security, and the Chemical Facility Anti-Terrorism Standards (CFATS) provide some regulatory benchmarking.
Don't let the multiplicity of standards confuse you. Most are quite similar. "If I take them all—ISA, NIST, NERC-CIP, etc.—they all have the same framework. Use a little bit of all of them," says Invensys' Rakaczky.
Cusimano adds, "Get a copy of ISA99.02.01 (March 2009, 'Establishing a Cyber Security Management System'). It's directed at control system users. It takes a lifecycle approach addressing risk assessment, policy and procedures. It's also industry-independent, but there are good documents from specific industries. They can be cross-referenced. There's no topic in one that is not addressed in the others."
Get outside help if you think you need it. Cyber security firms and consultancies with expertise in control systems can be a real help here.
The not-so-simple part of the answer is that cyber security is not just about Stuxnet.
Cyber security is about culture change—one of the hardest things to pull off in any organization. The CEO or someone on the board is going to have to make cyber security a priority and make it someone's job—complete with accountability—not just another duty tacked on to the control room operator's task list.
Getting the attention of the executive suite on any subject not related to next quarter's profits is a challenge, but Stuxnet's emergence may have made that easier. "One of the main takeaways," says Rockwell's Wylie, "is that there was a risk of loss of control and loss of intellectual property. That's an attention-getter."
Still, selling security is a tough gig. If the system works, you have nothing to show for expenditures, because nothing has happened. Says Wylie, "Security conversations are always short: 'We haven't been attacked yet.'"
He adds that the magic word for selling security to upper management is "uptime." Management wants uptime, availability and reliability in their systems, and they can't have those without security (and safety).
Brad Hegrat adds, "You can't regulate due diligence, but executives do care about uptime. There's the reputation factor as well and the possibility of a loss of public confidence. Accidents happen, but if I have to explain to the board why a digital security failure happened, the subject of negligence is apt to come up."
Good cyber security means developing a whole way of thinking about behaviors and operations. It means sorting out the knotty issues when IT and control engineering work together. It will take homework, and it will require ongoing training and vigilance.
What the Stuxnet affair has taught us, says Hegrat, is that in terms of security, "the entire enterprise [must] be treated collectively. It's going to be a full-on requirement from now on. [With Stuxnet] every last digital protection mechanism that you'd have deployed would have failed because this was a piece of targeted code that required human intervention. What this tells you is that no matter how secure the system is, if you don't have people who are properly trained, etc., you're not secure. On the flip side, a less secure system with properly trained people is better off."
Changing the way your organization thinks about security is a daunting task, but there is a model—safety.
OSIsoft's Owen says, "One of our executive VPs told me that 25 years ago, you'd get safety bulletins in morning meetings about fatalities. News was shared across the plant, and the culture did change slowly. Security is going to be the same kind of thing. These changes are cultural. It does take a while. I'm looking forward to the time when we're on top of this stuff instead of being reactive."
The question is, how much time? Stuxnet suggests that the time to let a security culture evolve slowly may be running out.
Nancy Bartels is Control's managing editor.
End of COTS and the USB Stick?
The Stuxnet worm exploited two fixtures of process automation operations: the commercial off-the-shelf (COTS) operating system used for control, and one of the most convenient, ubiquitous tools available, the USB stick. One obvious solution is to go back to using only closed systems and to ban USB sticks from the control room. But is either option viable?
John Cusimano, director of security services at security services and certification vendor exida (www.exida.com), doesn't think so. "Momentum for open systems is too great. Going back is almost not an option," he says. "We're far too dependent on being able to move data around throughout the organization so we can make good decisions and optimize processes. It may slow down a bit, but it won't stop. Nor is the drive to invest in COTS systems going to stop. The productivity and technical benefits are too great. Except for the most conservative industries, such as nuclear, most will continue to use them."
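For sites that do decide a USB ban is worth the inconvenience, the control can be technical rather than purely procedural. As one illustrative sketch—assuming a Linux-based engineering workstation; the filename and policy here are hypothetical, not drawn from the article—the operating system can be told never to load its USB mass-storage driver, so an inserted stick is simply never recognized as a drive:

```
# /etc/modprobe.d/block-usb-storage.conf  (hypothetical filename)
# Redirect any attempt to load the USB mass-storage driver to /bin/false,
# so the module never loads and USB sticks never appear as drives.
install usb-storage /bin/false
# Also block alias-based autoloading of the same module.
blacklist usb-storage
```

Windows shops can get a comparable effect through Group Policy's removable-storage restrictions. Either way, the point stands: a "ban" enforced by the machine is sturdier than one enforced by signage and goodwill.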