Users get the security they're willing to pay for...and not any more. #pauto #cybersecurity #opsmanage #NERC

Oct. 26, 2010

A lot of this will be in my December Editorial:

 This has been a tumultuous last few months for industrial control system security. Since July 15, when word of the Stuxnet worm hit, vendors and end-users alike have been thinking and talking about their security policies.

At the Invensys Operations Management user conference, OpsManage2010, in October, there was an “all-star” panel discussion on cybersecurity, including Ernie Rakaczky of Invensys; Tyler Williams of Wurldtech; Marty Edwards of Idaho National Laboratory; Tim Roxey of NERC; Peter Kwasion of Shell; and Charles Ross of McAfee Security.

The panel had very few comforting words for end-users who want their vendors to take care of all these nasty little security problems. The theme was, if you want that, tough noogies.

Rakaczky noted that in his view, users need to shoulder 65% of the responsibility for keeping their systems secure, vendors (like Invensys) should be responsible for only 15%, and the remaining 20% should be “co-shared.”

That’s probably the right division of responsibility, assuming the vendor is doing all it can to make its control system hardware, firmware and software as secure as possible. Tyler Williams, whose company, Wurldtech, makes the Achilles testing suite, made that point firmly.

Charles Ross shared some very scary statistics. Malware is growing by 800% year over year, and 2010 surpassed all of 2009 by April. Malware is a big business, and highly trained professionals are producing it. Stuxnet, for example, is what Ross called “the first Advanced Persistent Threat (APT)” against control systems. It isn’t hard to imagine Stuxnet’s payload being modified to target another vendor’s products besides Siemens’. When Rakaczky was asked whether IOM systems would survive a Stuxnet-style attack, he said, “Maybe.”

But you shouldn’t read that as a flaw unique to Invensys. The truth is that the hard words from system vendors are valid.

Tim Roxey from NERC offered a mea culpa about power companies that have aimed for mere compliance with the NERC CIPs rather than genuine added security. NERC’s position, he said, is that compliance is only the baseline for adhering to the NERC CIPs. What’s needed is a security culture at every critical industry.

Last year, at AutomationXchange, one end user reported, “I was completely surprised that somebody would attack us. All we do is make (snack foods)! But the attacker is in Federal prison right now.”

Ralph Langner and others have estimated that it cost over $1 million to write the Stuxnet worm. If somebody is willing to spend that much to create malware, you need to think about what they intended to do with it. And don’t pat yourself on the back if you don’t have Siemens PLCs, because Stuxnet has proved that an attack vector doesn’t have to be network-centric and can reach individual control devices on the plant floor. We are likely to see variants aimed at other vendors relatively soon. If that doesn’t scare you, you aren’t paying attention.

So, end-users and asset owners, here’s your challenge. You need to make sure that you have a security culture. That happens only when the C-suite is not only on board, but enthusiastic about, and understands, what has to be done. After all, instilling a security culture is inexpensive insurance.

Next, you have to organize as part of what Rakaczky called “a community of concern,” and work with vendors and other end-users to establish a set of common practices for devices, systems, and the way they are integrated in the plant. Systems, products, and network devices may all be tested (using something like Achilles, for example), but they can still be deployed insecurely, which eliminates all the benefit of specifying and purchasing “secure” devices. Specifying adherence to ISA-99 is one giant step in that direction.

This means that you will have to be firm with your vendors, and with your integrators, and insist on secure implementation of secure-by-design systems and devices. Don’t cave when they tell you that will cost lots extra, or you’ll get only the security you’re willing to pay for.