"In many languages, there is only one word for safety and security. In German, for example, the word is 'Sicherheit;' in Spanish it is 'seguridad;' in French it is 'sécurité;' and in Italian it is 'sicurezza.' " That's the start of the 2010 article by John Cusimano Director of exida's security services division and Eric Byres, CTO of Byres Security. Both Cusimano and Byres have significant expertise in both safety and security in process plants. Their article was titled, "Safety and Security: Two Sides of the Same Coin." They were introducing a relatively new concept that grew out of the similarity between layers of protection analysis (LOPA) for safety instrumented systems (SIS) and the defense in depth (DID) strategy for cybersecurity in industrial control systems.That was three years ago. Something has changed in the process industries, but not every top manager or plant manager or plant or corporate IT executive has seen the ramifications of it.
That "something" was, of course, the discovery of the infamous Stuxnet malware, which infected an Iranian uranium enrichment plant and damaged or destroyed over 100 special-purpose centrifuges. Morteza Rezaei, an Iranian automation professional, says, "Main affected country in the early days of the infection was Iran, so I could find many infected projects easily."
Unless you've spent the past two years asleep, not paying attention, with your head in the sand or with your fingers in your ears singing, "La la la la, I can't hear you!," you already know something about Stuxnet. Here's a quick reminder of what it did, and why it is important.
In Nancy Bartels' cover story in October 2010 ("Worst Fears Realized"), Nicolas Falliere of security vendor Symantec says, "Stuxnet can steal code and design projects, and also hide itself using a classic Windows rootkit, but unfortunately it can also do much more. It has the ability to take advantage of the programming software to also upload its own code to a PLC typically monitored by SCADA systems. Stuxnet then hides these code blocks, so when programmers using an infected machine try to view all of the code blocks on a PLC, they will not see the code injected by Stuxnet. Thus, Stuxnet isn't just a rootkit that hides itself on Windows, but is the first publicly known rootkit that is able to hide injected code located on a PLC."
Falliere adds, "In particular, Stuxnet hooks the programming software, which means that when someone uses the software to view code blocks on the PLC, the injected blocks are nowhere to be found. This is done by hooking enumeration, read-and-write functions, so that you can't accidentally overwrite the hidden blocks as well. Stuxnet contains 70 encrypted code blocks that appear to replace some 'foundation routines' that take care of simple, yet very common tasks, such as comparing file times, and others that are custom code and data blocks. By writing code to the PLC, Stuxnet can potentially control or alter how the system operates."
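The hooking behavior Falliere describes can be illustrated with a small conceptual sketch. This is not Stuxnet's actual code, and every name in it (PlcInterface, list_blocks and so on) is a hypothetical stand-in for a vendor programming API; it only shows the general idea of wrapping an API's enumeration and write functions so that injected blocks disappear from listings and can't be accidentally overwritten:

```python
# Conceptual sketch only -- illustrates API hooking, not real PLC code.
# All names here are hypothetical, not an actual vendor API.

class PlcInterface:
    """Stand-in for programming software's view of blocks on a PLC."""
    def __init__(self):
        self.blocks = {"OB1": "legit logic", "FC10": "legit routine"}

    def list_blocks(self):
        return sorted(self.blocks)

    def read_block(self, name):
        return self.blocks[name]

    def write_block(self, name, code):
        self.blocks[name] = code


def hook(plc, hidden=frozenset({"FC1869"})):
    """Wrap enumeration and write calls so that 'hidden' blocks
    vanish from listings and silently resist being overwritten --
    the rootkit behavior described above."""
    real_list, real_write = plc.list_blocks, plc.write_block
    plc.list_blocks = lambda: [b for b in real_list() if b not in hidden]

    def guarded_write(name, code):
        if name in hidden:
            return            # drop writes that would clobber injected code
        real_write(name, code)

    plc.write_block = guarded_write


plc = PlcInterface()
plc.write_block("FC1869", "injected code")  # attacker plants a block...
hook(plc)                                   # ...then hooks the tooling
print(plc.list_blocks())                    # injected block is invisible
```

The point of the sketch is that the PLC's contents never change when the hook is installed; only the programming software's *view* of them does, which is why an engineer browsing blocks on an infected machine sees nothing amiss.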
The two fundamental takeaways from this, for managers, directors and all IT people working in manufacturing enterprises, are, first, that network-centric cybersecurity planning works about as well as the Maginot Line did, and second, that any control system in any industry is vulnerable to a Stuxnet-type attack, whether or not it is connected to an IT-serviced network.
There is a third fundamental point that must be made: Stuxnet used cyber means to attack a plant's operating control system and make it fail in a dangerous and unsafe way.
ISA84 (now IEC 61511 and IEC 61508) recognized the need for active functional safety programs in process plants. Many plants now have formal functional safety programs. They have re-evaluated their alarm management systems and have brought their SIS into compliance with the IEC standards. Luis Duran, a safety expert with ABB, puts it this way: "I see that plants and companies with a strong safety culture see safety as a core value positively affecting their economic performance."
Some companies are very far down the road to functional safety. The Dow Chemical Co., as Eric Cosman, co-chair of ISA99 and a security expert for Dow points out, has been working on functional safety since the early 1960s. What a lot of people don't know, he notes, is that Dow has had an active cybersecurity program since the 1990s.
"From my perspective," says Walter Sikora, vice president of security solutions of Industrial Defender, "safety is taken seriously, openly communicated and a high priority. Most utilities and plants have a 'safety moment' before every meeting to stress the point. Even before someone is allowed to visit a plant, they usually go through a safety training video. Very few, if any, companies do the same with cybersecurity. Have you ever visited a process plant and seen a big sign showing how many days since their last cyber incident?"
"It depends on the industry," says Joe Weiss, principal of Applied Control Solutions and chief blogger of Control's "Unfettered" cybersecurity blog. "The electric industry treats security as a compliance, not a reliability or safety issue. Other industries, such as chemical and petroleum, treat security as an important reliability and safety consideration. For example, consider the membership of the ISA99 Leadership Committee. The end users on the committee are primarily from oil/gas and chemicals, with no representation from electric utilities."