
You Get the Security You're Willing to Pay For

Dec. 6, 2010
Since the Stuxnet Worm Hit, Vendors and End Users Alike Are Thinking and Talking About Their Security Policies
By Walt Boyes, Editor in Chief

It's been a tumultuous few months for industrial control system security. Since July 15, when word of the Stuxnet worm hit, vendors and end users alike have been thinking and talking about their security policies. At the Invensys Operations Management user conference in October, an "all-star" panel discussed cybersecurity. Members included Ernie Rakaczky of Invensys, Tyler Williams of Wurldtech, Marty Edwards of Idaho National Laboratory, Tim Roxey of NERC, Peter Kwasion of Shell, and Charles Ross of McAfee Security.

The panel had very few comforting words for end users, who want their vendors to take care of all these nasty little security problems. The theme was, if you want that, tough noogies.

Rakaczky said, in his view, users need to take up to 65% of the responsibility for securing their systems, while vendors should be responsible for 15%, and 20% should be "co-shared."

Those numbers are probably right, assuming the vendor is doing all it can to ensure that its control system hardware, firmware and software are as secure as it can make them, said Williams, whose company, Wurldtech, makes the Achilles testing suite.

Malware is growing by 800% year over year, and 2010 surpassed all of 2009 by April, said Ross. Malware is a big business, and highly trained professionals are producing it. Stuxnet, for example, is what Ross called "the first advanced persistent threat (APT)" to control systems. It isn't hard to imagine the payload of Stuxnet being modified for products other than those of Siemens.

Rakaczky was asked whether IOM would survive a Stuxnet-style attack, and he said, "Maybe." Don't take that as a knock on Invensys. The truth is that the same hard answer applies to every vendor's system.

Tim Roxey of NERC offered a mea culpa on behalf of some power companies that have aimed for mere compliance with the NERC CIPs rather than for added security. NERC's position, he said, is that compliance is only the baseline for adhering to the CIPs. What's needed is a security culture in every critical industry.

German security expert Ralph Langner and others estimate that it cost over $1 million to write Stuxnet. If somebody is willing to spend that much to create malware, you need to think about what they intended to do with it. And don't pat yourself on the back just because you don't have Siemens PLCs: Stuxnet has proved that an attack vector doesn't have to be network-centric and can reach individual control devices on the plant floor. We are likely to see variants aimed at other vendors relatively soon. If that doesn't scare you, you aren't paying attention.

So, end users and asset owners, here's your challenge: Make sure you have a security culture. Get the C-suite not only on board, but enthusiastic about and conversant with what has to be done. After all, building a security culture is inexpensive insurance.

Next, you have to organize as part of what Rakaczky called "a community of concern," and work with vendors and other end users to develop a set of common practices for devices, systems and the way they are integrated in the plant. Devices and systems may all be tested, but they can still be deployed insecurely, eliminating all the benefit of specifying and purchasing "secure" devices in the first place. Specifying adherence to ISA-99 or the WIB specs is a giant step in that direction.

You will have to be firm with your vendors and with your integrators, and insist on secure implementation of secure-by-design systems and devices. No caving when they tell you that will cost a lot extra. Otherwise, you'll get exactly the security you're willing to pay for.