Practical improvements for the cybersecurity of your process operations and organizations

May 26, 2020

In this podcast, John Cusimano, vice president of industrial cybersecurity, aeSolutions, a consulting, engineering and CSIA-member system integrator in Greenville, S.C., talks with Jim Montague, Control’s executive editor, about the present cybersecurity terrain, and delivers practical recommendations that users can employ to improve the cybersecurity of their process applications, facilities and organizations.


Jim Montague: Hi, this is Jim Montague, executive editor of Control magazine, and this is the latest in our Control Amplified podcast series. In these recordings, we talk with expert sources about process control and automation topics, and try to go beyond our print and online coverage to explore some of the underlying issues impacting users, system integrators, suppliers, and other people and organizations in these industries. For this podcast, we're talking to John Cusimano, vice president of Industrial Cybersecurity at aeSolutions. aeSolutions is a consulting, engineering, and CSIA-member system integrator in Greenville, South Carolina. John has been a perennial source for Control magazine on cybersecurity, including our 12 Days of Cybersecurity miniseries this past December.

Well, John, sorry for the usual preamble, and thanks for joining us today.

John Cusimano: Thanks, Jim. Happy to be here.

Jim: Okay. First of all, since cybersecurity threats and solutions are evolving so rapidly, what's been happening in the past few months?

John: Well, in many ways, you could say it's more of the same in that we continue to see threats, vulnerabilities, and impacts or consequences, or I guess you could say incidents, that are all following along the same lines as we have been seeing for the last couple of years, but more and more frequently. So for example, the new threats we're seeing are mostly in the malware category, and most of those are in the ransomware category. So, new ransomware like Ryuk, DoppelPaymer, LockerGoga, Maze, Snake, lots of interesting names that are generally all focused on some form of ransomware encrypting files on computers. But many of those threats have been directed at industrial facilities. And I'm gonna talk about some of those incidents in a moment.

On the vulnerability side of things, so how do those threats exploit the systems? It's usually through some known vulnerabilities. In the recent past, it's been related to the EternalBlue vulnerability in SMBv1 that was the cause of the now infamous ransomware like WannaCry and NotPetya. More recently, we're seeing them based on some different vulnerabilities, like the BlueKeep RDP vulnerability. And then Microsoft has announced vulnerabilities in SMB version 3. So it was SMB version 1 that was the culprit for the WannaCry and NotPetya ransomware, and now that Microsoft has announced there are vulnerabilities in SMB 3, and I don't believe there's a patch for that yet, we can expect attackers to take advantage of those. And then there have been incidents, and that's really what matters. There can be threats, there can be vulnerabilities, but it's where those come together that causes an incident.

So, some of the bigger incidents in recent times, a UK-owned company called EVRAZ has been the victim of ransomware. They have operations in Russia, Ukraine, Kazakhstan, Italy, Czech Republic, and also here in the U.S. and Canada. Their operations were paralyzed across Canada, and in the U.S. most of their manufacturing operations shut down. And this one was blamed on the Ryuk ransomware.

Jim: Okay. Well, I was just gonna add, when we spoke last fall, you emphasized that the Triton/Trisis safety system malware shook up users who previously thought their safety systems were secure. And I just wanted to know now, are they continuing to wake up and how have they been responding lately?

John: Yeah, that's been interesting to see the response to that. So, fortunately, the world hasn't seen any variants or new safety system attacks, at least not that I'm aware of. And I follow this pretty closely. But asset owners have definitely been responding to the threat of Triton/Trisis or any subsequent variations of it. Where I've personally seen it is that we've been asked to perform a lot more safety system cybersecurity assessments, particularly around the requirements in 61511, or I should say IEC 61511, which is the functional safety standard. So it's not a cybersecurity standard, it's a safety standard that was modified in 2016 to add requirements that any safety system must be evaluated for cybersecurity as well.

On top of that, we've seen companies that have actually gone back to their internal safety system standards and added that requirement internally. So they're basically adopting these two new requirements in 61511 that came out in 2016. Just as a reminder, that actually predates late 2017, when Trisis and Triton were discovered. So the standards went into effect in 2016, saying you really should, or must, shall, I think they're shall statements, "You shall assess the security of your safety instrumented systems." In late 2017, we actually saw a very compelling reason to do so. And I'd say in 2019 and ongoing, companies are adopting that and actually changing their internal standards and making it a required practice. I'm particularly seeing it for greenfield projects: when companies are putting in new safety systems or upgrading their safety systems, they're adding that as a requirement for the engineering team to perform cybersecurity assessments.

Jim: Right, so folks are doing some of the things they maybe should have been doing all along, but it's very helpful that it seems that they're doing that. And then thinking about cybersecurity as part of a safety strategy is a very popular route to take for a lot of people. Because, conceptually, it makes it easier to approach I guess, right?

John: There's very compelling reasons to do so, right.

Jim: So, you know, some of the things we talked about earlier are like are key switches and other physical measures continuing to grow in popularity, in addition to the software remedies? And how can users approach those physical measures, especially if they were initially assuming that cybersecurity just meant using software only?

John: Right. So there are certain things we find when we do these security assessments of safety systems. There are certain risks that get uncovered. And just to be clear, when you do a security assessment of a safety instrumented system, I'm not just talking about the controller, not just the Triconex controller or Siemens controller. I'm talking about the entire safety instrumented system, which includes sensors, logic solvers, actuators, and all of the computers and networks around that. So that's what's in scope when you do this. And the kinds of things we find, the typical high risks, are in how companies are implementing safety bypasses and overrides, and how they're integrating smart instrumentation. So safety bypasses and overrides: it's common that there are times, usually for maintenance reasons, that you may need to temporarily bypass a safety instrumented function. Or actually, more appropriately, a single device. Like if you're voting three pressure transmitters, two out of three, and one of the pressure transmitters has some kind of an issue. You want to go do work on it, so you typically would put it into bypass, and that would take it out of the vote.

But it's in how companies are implementing these bypasses, what technology are they using? What access control is required? What authorization levels? Can one person do it? Or does it require a separation of duties where two different people have to participate in order to put the function into bypass? Because when you're bypassing a safety function, you're temporarily disabling that safety function, or at least degrading it. So you need to do it carefully. It needs to be monitored, and you need to make sure that you remove the bypasses when the work is done. That's a big area that we're finding risk in how that's implemented. Is it a button on the HMI that the operator can press? Does it require some elevated levels of privilege? Is there a key switch involved? Can one person do it or does it require two people? Can it be done all remotely or does somebody locally in the plant have to participate?

And the solution to that usually is some combination of better access control, better monitoring, and many times some physical measures like key switches, either in the field or in the control room, so that somebody who is in the plant participates in either putting that safety function into bypass or removing it. There's a lot of different solutions, but most of the time, the ways to reduce the risk are relatively inexpensive. They just require a little bit of engineering and many times some operational procedural changes.
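The bypass behavior John describes, taking one transmitter of a two-out-of-three vote out of service, can be sketched in a few lines of Python. This is a hypothetical illustration: the tag names, trip point, and the choice to degrade to 1oo2 when a channel is bypassed are assumptions for the sketch, not any particular logic solver's implementation.

```python
# Hypothetical sketch of 2-out-of-3 (2oo3) voting for a safety instrumented
# function, showing how a maintenance bypass degrades the vote. Tags, trip
# point, and the degraded-mode policy are illustrative, not vendor logic.

TRIP_POINT_PSI = 150.0

def vote_2oo3(readings, bypassed):
    """Decide whether the safety function should trip.

    readings: dict of transmitter tag -> pressure (psi)
    bypassed: set of tags currently in maintenance bypass (out of the vote)
    """
    active = {tag: psi for tag, psi in readings.items() if tag not in bypassed}
    votes = sum(1 for psi in active.values() if psi >= TRIP_POINT_PSI)
    # Full complement: 2oo3. With one channel bypassed, this sketch assumes
    # the common fail-safe degradation to 1oo2 -- a single high reading trips.
    required = 2 if len(active) == 3 else 1
    return votes >= required

readings = {"PT-101": 155.0, "PT-102": 120.0, "PT-103": 118.0}
vote_2oo3(readings, bypassed=set())       # False: only 1 of 3 votes to trip
vote_2oo3(readings, bypassed={"PT-102"})  # True: vote degraded to 1oo2
```

The sketch makes the risk concrete: whoever can add a tag to `bypassed`, whether through an HMI button or from a compromised workstation, is directly changing the conditions under which the plant trips. That is why the access-control, separation-of-duties, and key-switch questions above matter.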

Jim: Right. So what you're talking about really is a lot of the good basic practices of hygiene that are logical and should be well known already, right?

John: They should be, but oftentimes it's not until you take a look at the big picture and study the safety system and the networks and computers around it that you're able to see where there are gaps or vulnerabilities in the design. People make a lot of assumptions about how things are supposed to operate. But until they do a study like this, they're not seeing, "Oh, yeah, but a malicious person, or a malicious user, or malicious software might be able to do the same thing that I want my authorized user to do." Then they'll recognize that they may not have put the controls or countermeasures in place to prevent an unauthorized user from doing the same thing.

Jim: Right. But the remedies are easily accessible once you have that overall view, I guess, right?

John: They usually are. Yeah, you can engineer a solution without necessarily spending a lot of money.

Jim: Cool. Were there any other underlying cybersecurity issues or remedies that have emerged or become more prominent recently?

John: Well, I touched on instrumentation a moment ago, and that's another area that needs to be studied a little more carefully. So smart instrumentation is great—HART, Foundation Fieldbus, digital instrumentation—but how you integrate it into either your basic process control or your safety systems can create unwanted cyber exposure or vulnerabilities. A good example is smart transmitters or smart valve positioners that are part of either a basic control or safety function, but let's take a safety function to stay on that same theme. So say you've installed some smart pressure transmitters, I'll stick with pressure transmitters, in a safety function, and they can be connected to an asset management system for maintenance purposes, so instrument techs are able to maintain those instruments, and they can do so remotely.

What can often be overlooked is the fact that that instrument is part of a safety function. And there are certain parameters, like, for example, the range on the transmitter, that if they are changed, will change the safety function. So, you don't want those inadvertently changed. And if the instrumentation can talk to an asset management server, which usually sits at Level 2 or 3 in the control system hierarchy, the Purdue model, you could actually be making a change at Level 0 in your safety strategy from a device at Level 3, many times with no additional controls or barriers other than access to that asset management server. So that needs to be looked at.

And again, there are remedies to that. Simple things: most instrumentation products do have an integral key switch, or actually usually a DIP switch, that you can set so the device is in a write-protect mode and critical parameters cannot be changed remotely. That's one solution. Other solutions might be to put those instruments in a separate group with additional access controls, or even a separate asset management client for those devices. Again, you can engineer solutions, and they're not typically all that costly. You just have to recognize the risk in the design and make some appropriate changes.
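The range-change risk John describes can be made concrete with a small sketch of 4-20 mA scaling: the transmitter maps pressure onto the loop current using its configured range, while the logic solver interprets that current using the range it was engineered for. The ranges and trip point below are made-up numbers for illustration, not from any real device or asset management system.

```python
# Illustrative sketch (not vendor code) of why an unauthorized re-range of a
# smart transmitter corrupts a safety function. A 4-20 mA signal is linear
# over the configured range; if the transmitter's range is changed but the
# logic solver's scaling is not, the interpreted pressure is wrong.

def to_milliamps(pressure_psi, range_hi_psi, range_lo_psi=0.0):
    """Transmitter side: map actual pressure onto the 4-20 mA loop."""
    frac = (pressure_psi - range_lo_psi) / (range_hi_psi - range_lo_psi)
    return 4.0 + 16.0 * frac

def to_pressure(ma, range_hi_psi, range_lo_psi=0.0):
    """Logic-solver side: interpret loop current with the engineered range."""
    return range_lo_psi + (ma - 4.0) / 16.0 * (range_hi_psi - range_lo_psi)

ENGINEERED_RANGE = 300.0   # psi, what the SIF design assumes
TRIP_POINT = 200.0         # psi
actual = 250.0             # process pressure, above the trip point

# Healthy case: transmitter and logic solver agree on the range.
seen = to_pressure(to_milliamps(actual, ENGINEERED_RANGE), ENGINEERED_RANGE)
# seen == 250.0 psi -> the trip fires as designed

# Tampered case: range re-spanned to 0-600 psi from the asset management server.
seen_bad = to_pressure(to_milliamps(actual, 600.0), ENGINEERED_RANGE)
# seen_bad == 125.0 psi, below the trip point -> the SIF never trips
```

A write-protect DIP switch on the transmitter, as mentioned above, blocks exactly this kind of remote parameter change at the device itself, which is why it is such a cheap and effective control.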

Jim: And safety systems have always merited and required an additional level of awareness beyond regular controls. So a lot of this should be familiar to, or at least approachable by, folks pursuing cybersecurity as well, correct?

John: Yeah, that's correct. It's an interesting situation because in many cases, it's the technology that's enabled us to do some really great things. But it has introduced vulnerabilities that weren't obvious to the people that installed them. You know, we didn't go there, but remote access is another technology now that's readily available. There's lots of ways that you can implement remote access. And again, you have to make sure that when you're introducing remote access to your control systems, that you're doing it in a secure manner and that you should go even an extra step on your safety systems. And many companies just say there is no remote access to safety systems. Others do it but they want to have at least one additional layer of protection, if not more.

Jim: Then, getting a little bit more into the human element side, how can control engineers and other operations technology personnel or OT personnel understand IT's security priorities and policies? And then how can IT appreciate process operations and safety priorities? And also, how can they learn to work together on cybersecurity? Is it just better leadership by senior management or what's needed?

John: Yep. It's an ongoing challenge that the industry faces. And the interesting thing that I've found is that industrial cybersecurity is the thing that is forcing IT and OT to work together. Because it's a problem that neither group can solve independently, they need to work together. You need people that understand IT, and networks, and servers, and domains, and remote access technology, network segmentation, and so on. And then you need automation and operations personnel that can correctly apply those technologies in a way that is still appropriate for operations. In other words, you can't just turn all of that over to IT, because IT personnel typically don't understand the constraints of the operations environment. But the automation people many times don't have the skills to implement. So it's a forcing function. These groups need to work together, and we're seeing more and more of a breakdown of those barriers.

Three things come to mind for me in terms of how to bring these disciplines together. Training is one, and there's lots of training out there focused on ICS cybersecurity. Most of that training is designed for either teaching IT people about OT and how to secure OT, or teaching OT people about IT and IT security concepts. The second one is to just get people working together and collaborating on projects, and get IT involved early. The biggest mistake I've seen companies make is moving along on a big automation system upgrade, or putting in a new greenfield system, and at the 11th hour somebody says, "Hey, we should probably secure this." So the right thing to do, of course, is to get cybersecurity people involved early in the project and throughout the project to make sure security gets designed in.

Jim: Well, bolt-on is nowhere as good as baked-in I think everybody's saying, right?

John: Right, "build security in" is a motto out there that people say. And my third and last point on getting people to work together, which I've found to be very effective, is these cybersecurity risk assessment exercises. We call it CyberPHA at aeSolutions. They are designed to be cross-functional in nature, because no one person in one discipline can really understand how the system is constructed, how it operates, where the vulnerabilities are, what the impacts could be if the system were compromised, and therefore what the risks are. When we do a CyberPHA, it is a team effort. It's cross-functional in nature. We facilitate and guide the process. But it's all around getting those disciplines together, usually in the same room, and in coronavirus times it's probably gonna be remote, and that works as well. But we get people all talking about the same thing. Typically we look at it zone by zone, and we talk about: what are the threats? What are the vulnerabilities? What are the consequences? And you need everybody's perspective on that to get a realistic answer and to get buy-in on what things need to change to achieve tolerable risk.

Jim: Yeah. And also on the teaching front, I know we talked earlier about aeSolutions recently partnering with the SANS Institute to offer the one-day class ICS cybersecurity for managers, which is based on applying the NIST Cybersecurity Framework and the ISA/IEC 62443 standard. So how's that effort going lately? And then how's the focus or curriculum been updated at all recently?

John: Going really well. So we just ran the course for a second time on March 4th at the SANS ICS Summit in Orlando, just in time before all flights were grounded. The class was full except for a few that just couldn't make it due to their company's travel restrictions at the time. But we got great reviews from the students and we did make some improvements from the...the inaugural class was in October of last year in Houston. And then the second one was in March. And SANS has decided to run the course again at their fall summit in Houston. And we're actively working on making that course available online very soon.

Jim: There we go. Netflix and YouTube.

John: You can imagine what one of the drivers is.

Jim: Exactly, exactly. Well, any avenue is a good one, as long as the message gets across. I'm glad to hear that it's succeeding. And yes, online, Netflix, whatever would be fine, especially since people may have some extra time on their hands. During recessions, or when there's not as much business, people do have time to retool and relearn, and this could be a good opportunity for them.

John: That's right. That's absolutely right. 

Jim: Okay. Well, John, listen, that was a fine update on cybersecurity. A few weeks or months had gone by, and I was thinking, "Man, I wonder what's happening." So now I know, and thanks for the update today.

John: Oh, great. Yeah. Thanks. Thanks, Jim, for having me and for hosting the podcast.

Jim: Terrific. Okay. This has been, as we always say, another Control Amplified podcast. I'm Jim Montague. Thanks for listening, and please remember that Control Amplified podcasts are available on most podcasting apps, such as Apple Podcasts and Google Play, and of course, always on Control magazine's YouTube channel. Plus, you can just listen at any point. And thanks again for listening, everybody.

For more, tune in to Control Amplified: The Process Automation Podcast

About the Author

Control Amplified: The Process Automation Podcast

The Control Amplified Podcast offers in-depth interviews and discussions with industry experts about important topics in the process control and automation field, and goes beyond Control's print and online coverage to explore underlying issues affecting users, system integrators, suppliers and others in these industries.