Security and Safety with blood and guts

Be afraid, be very afraid...and pay attention, Grasshopper! Here's a book review that is somewhat outside everybody's field, but I think you should all read it. It puts into perspective what we are all trying to prevent.

I just finished reading a monograph (a nearly 300-page monograph, at that) by an online friend, Chuck Stewart, who is an emergency medicine physician. The monograph is Weapons of Mass Casualties and Terrorism Response Handbook, by Charles Stewart MD, FACEP, published under the auspices of the American Academy of Orthopaedic Surgeons by Jones and Bartlett Publishers (ISBN 0-7637-2425-4). You can buy the book at Amazon: http://www.amazon.com/Casualties-American-Orthopaedic-Surgeons-Monograph/dp/0763724254/ref=sr_1_1?ie=UTF8&s=books&qid=1218408387&sr=8-1

Here's the review I just posted on Amazon:

Although I write science fiction, and help edit two science fiction magazines, my day job is as editor of a major technical publication...Control magazine (www.controlglobal.com). I have been studying this subject from the other direction...from the direction of terrorist attacks on the control systems of the critical infrastructure industries. Chuck Stewart's book is an absolutely terrifying look at what might happen if one of those attacks, or God forbid more than one of them, were to succeed. As you might expect, some of the deep medical content is over my head...I am an engineering type, not a medical type...but some of it isn't. If you are interested in the security of the critical infrastructure of the world, you ought to take a short detour through Chuck's book. What happens if terrorists (foreign or the home-grown variety) blow up a refinery, and then mount a poison gas attack on the responders and victims? Chuck knows, and lays it all out in his book. This book is highly recommended for any serious researcher in safety and security, for anyone who works and manages in the critical infrastructure industries, for government types like DHS and FBI, and of course, for the medical professionals Chuck wrote it for.

Comments

  • Walt, when you say things like "blow up a refinery," it suggests that some software fault (e.g. one caused by a hacker) might have the capability of doing that. But as you know, the ultimate protection, and a great deal of effort goes into it, is at the lowest physical level possible: relief valves, for example, plus hard-wired logic, high-integrity safety systems, etc. I had this argument over Y2K many years ago. Don't you think you may be feeding the trolls? Francis www.controldraw.co.uk

  • No, I am not feeding trolls. Francis, I saw a live demonstration of a hack against an SIS last week. It took 26 seconds to cause the valves to fail open. The danger is in fact real.

  • More details, please, Walt. My mind boggles that anyone could engineer an SIS to permit such a hack, and that such an SIS could even be called a safety system. And does the situation not imply that a failure in the SIS (hacked or not) could open the valves? So how can it be called an SIS? Francis

  • Your guess is as good as mine. The fact remains, this product is being sold as an SIS. I do not know the vendor. Any time an SIS is connected to the plant network, it becomes open to attack, and nearly all PLCs, including safety PLCs, are vulnerable to DoS attacks unless properly firewalled. I don't have much more information, because the demonstrator was unwilling to share many details. [A minimal sketch of that kind of default-deny firewalling appears after the comment thread.]

  • Walt, did you get the comment I made yesterday? Comments do not appear quickly on this 'blog' (they do on mine) and I did not save what I said. It was something like: how can my guess be as good as yours? You were there and I was not; you are the reporter, so please report the entire event, with names and details.

    I still do not believe it. It means that any software failure, not just hacking, could cause the valves to open. That is not an SIS, nor even a half-decent control system. Of course, anyone who connects an SIS (or any control system) to the net without firewalls etc. is an idiot who deserves what they get.

    So please report the details, not just what some suit presented. Francis
    www.s88control.blogspot.com

  • No details are available. No suits were involved. If you don't believe it happened, I'm sorry.

  • Walt, I would like to briefly explain what I would expect to find in an SIS, and indeed what I thought they did; they used to have these things:

    1. A physical switch, in fact a key switch. Only when the key has been inserted and turned to the Allow Program Change setting should it be possible to change the software that can open valves, etc.

    2. The control logic must run completely independently of the network it is connected to. Yes, the controller should expose the values of its variables and I/O to the network, but mostly read-only. The network should only be able to write to SIS values that cannot override the safety function; Reset Alarms might be an example. The network should not be able to write to the outputs. [A minimal sketch of this kind of write gating appears after the comment thread.]

    I really thought that this sort of protection, and much more, was defined as a paramount aspect of SIS design. I can see no reason why it cannot be done; it is not difficult, because it is the sort of thing you can do in dedicated process controllers. It is impractical for PCs, but nobody would use a PC for an SIS, would they?

    As I said earlier, I worked on Y2K. I found that the people selling the disaster scenario had no idea what a process controller is. The ones making the most alarming statements tended to wear suits. Francis

  • Francis, I was there too. The demonstration PLC was in a black box because the security firm was working with the vendor to correct these problems. The people performing the attack are well known to anyone in the industrial security business.

    Suffice it to say that this was not some esoteric or fancy programming hack. In fact, it was so simple that most of us who understood what it was were shaking our heads in amazement. I won't say any more than this, because I don't want to give anyone a head start at hacking these things.

    You need to understand that SIS does not deal with security; it is focused almost entirely on reliability. This is one reason why my skin crawls every time I see someone marketing an SIS controller that sits on a network.

  • I totally agree about connecting an SIS to a network, but even so, did this demonstration PLC conform to the requirements I expressed in comment 7? A physical key switch, etc.? It does not sound like it. And its being hidden in a box means you could not even see whether the switch (if it has one) was in the right position. Was this PLC certified for use in safety applications? If so, then it should be named, or at the very least the supplier should be sending out critical advice to all users. If it was not, then it should not be used in an SIS.

    As for an SIS being focused almost entirely on reliability, I beg to differ: they are focused first on safety, then on reliability, so that when they fail, they fail safe.

    Don't get me wrong, of course systems have to be critically evaluated for security when networked. But I still think Walt's original "blow up a refinery" statement is extrapolating a genuine issue to circumstances way beyond the test case. Francis

  • Of course I am extrapolating. But it isn't a very far reach. Yes, the safety system in question is being sold, and yes, it is TUV certified. We aren't going to reveal whose it is, or even whose it isn't, because the vendor is actively trying to remedy the situation. That's fair. The system was supposed to fail safe. It did not. And as Jake points out, it was a very simple hack.

  • Walt, it sounds like TUV (and perhaps all the certification bodies) should be re-evaluating their certification process. Is this happening? Furthermore, TUV may have certified the system, but was that an error in their process or in the standards they were working to?

    Also, of course, the safety depends on more than just a certificate. From http://www.instrument-net.co.uk/safety_integrity.html: "It must be acknowledged that the achievement of TUV Class is not absolute. Due to the complexity of PES, all TUV certifications are rewarded based on particular design, diagnostic, operational, testing, and maintenance restrictions. These are documented in the certification report from TUV. All PES have restrictions for TUV Class 5 and 6. Some of these restrictions can result in the requirement that the PES operate in a configuration that is different from the advertised product. These restrictions must be examined carefully to ensure that the PES meets the required TUV class in the configuration that will be used in operation."

  • Yes, I think TUV should be re-evaluating their certification process, and it is one of the things I'm going to be talking about when I keynote their Safety Symposium in Cologne in two weeks. I don't think it is an error in their process. I don't think it is an error in the standards, exactly, either. I think the problem is that we are now dealing with meta-systems...systems of systems...that complexly (is that a word?) interact, and we don't easily deal with sorting out all the interactions and the potential for harm therein.

  • One final thought regarding this situation: how do we find these controllers and ensure they're patched properly after the company comes up with a fix? Or are we simply going to treat this as if it were a car from several years back and only make the fix available to those who know how to ask for it?

    There are no good answers here. In many cases the controller gets purchased by one party, installed in a product made by a second party, delivered to a general contractor on a skid by a third party, and then turned over to a fourth party for operation.

    Good luck figuring out where these controllers actually are, and then who needs to be notified...

  • I have an Allen-Bradley ControlLogix PLC in a SIL-2 application. There is no way to hack it. You have to go through two firewalls to see it. The key switch is in Run mode; you have to move it to alter the program, and moving the key switch out of Run mode trips the system to a safe state.

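Walt's point above about PLCs needing to be "properly firewalled" can be made concrete with a minimal sketch. The Python below is purely illustrative; the addresses, the engineering subnet, and the choice of Modbus/TCP port 502 are assumptions, not details of any real plant or of the demonstrated system.

    # Illustrative default-deny filter of the kind a plant firewall would
    # enforce in front of a controller. Addresses and ports are hypothetical.
    from collections import namedtuple
    from ipaddress import ip_address, ip_network

    Packet = namedtuple("Packet", "src dst dst_port")

    ALLOWED_SOURCES = [ip_network("10.10.5.0/28")]   # hypothetical engineering LAN
    PLC_ADDRESS     = ip_address("10.10.9.17")       # hypothetical safety PLC
    ALLOWED_PORTS   = {502}                          # Modbus/TCP only; nothing else

    def should_forward(pkt: Packet) -> bool:
        """Default-deny: forward traffic to the PLC only if every check passes."""
        if ip_address(pkt.dst) != PLC_ADDRESS:
            return False                 # not destined for the protected controller
        if pkt.dst_port not in ALLOWED_PORTS:
            return False                 # unexpected service; drop it
        return any(ip_address(pkt.src) in net for net in ALLOWED_SOURCES)

    # A scan from an office PC is dropped; the engineering workstation gets through.
    print(should_forward(Packet("192.168.1.44", "10.10.9.17", 502)))   # False
    print(should_forward(Packet("10.10.5.3", "10.10.9.17", 502)))      # True

The point is only that everything is refused unless it is explicitly needed; rate limiting on top of a rule set like this is the usual complement against the DoS attacks mentioned above.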
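Francis's two requirements, a physical key switch gating program changes and network writes restricted to values that cannot override the safety function, can likewise be sketched as plain logic. Again, this is illustrative only: the tag names, the key-switch input, and the safe-state action are assumptions, not a description of the demonstrated system or of any vendor's product.

    # Illustrative sketch of the gating Francis describes. Tag names, the key
    # switch input, and the safe-state action are hypothetical.

    READ_ONLY_TAGS = {"PT-101", "TT-204", "OUTPUT_STATE"}   # exposed, never writable
    WRITABLE_TAGS  = {"ALARM_RESET"}                         # non-safety writes only

    class SafetyController:
        def __init__(self):
            self.key_switch_in_run = True    # position of the physical key switch
            self.tags = {t: 0 for t in READ_ONLY_TAGS | WRITABLE_TAGS}
            self.tripped = False

        def network_read(self, tag):
            # Anything may be read over the network; reads cannot affect safety.
            return self.tags.get(tag)

        def network_write(self, tag, value):
            # Network writes are accepted only for non-safety tags.
            if tag not in WRITABLE_TAGS:
                return False                 # refused: outputs stay untouched
            self.tags[tag] = value
            return True

        def network_program_change(self, new_logic):
            # Program changes are refused outright while the key switch is in Run.
            if self.key_switch_in_run:
                return False
            return True                      # download of new logic would go here

        def key_switch_moved(self, in_run):
            # As the last comment notes, leaving Run trips the system to a safe state.
            self.key_switch_in_run = in_run
            if not in_run:
                self.trip_to_safe_state()

        def trip_to_safe_state(self):
            self.tripped = True
            self.tags["OUTPUT_STATE"] = 0    # e.g. de-energize outputs, close valves

The design point is that the gate lives in the controller's own logic and never depends on what arrives over the wire, which is why a certified system that could be made to fail its valves open in 26 seconds over the network is so troubling.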