Industrial control systems are the heart of manufacturing worldwide. Every sort of manufacturing process, from semiconductors to oil and gas and everything in between, uses an industrial control system. Some are relatively simple, such as a PLC controlling a work cell on a factory floor. Some are more complex, such as a DCS in a refinery. Others are extremely complex, such as a SCADA system at a mine with more than 100,000 I/O points.
Like enterprise computing, industrial control systems have traveled a path from standalone systems to the modern, highly interconnected world of Ethernet, the Internet and cloud-based computing.
But while enterprise computing and even home computing started confronting cyber attacks a decade ago, industrial control systems lagged far behind. One of the main reasons is the significant difference between asset lifecycles in the enterprise computing and industrial control system spaces. Enterprises see nothing unusual about replacing all their systems every two to three years, but industrial control systems are designed and operated to be replaced every 20 to 30 years, and some are kept going to the end of the useful life of the plant itself.
So, while enterprise IT has managed to keep up with cybersecurity, anti-virus and network defense by continually upgrading its systems, most industrial control systems have relied on what many cybersecurity researchers call "security by obscurity."
All of these industrial control systems share a common flaw. "They are all highly reliable, purpose-built and very efficient," says Patrick Miller, president and CEO of EnergySec in Portland, Ore., a not-for-profit educational institution devoted to improving security in the energy sector. "However, most platforms are not secure. They were never designed with security in mind."
Beginning in the early 1990s, enterprise computing systems and networks were quickly connected to other networks and the Internet. Industrial control systems didn't begin to be connected to even their own enterprise networks until a decade later, and in many cases, the connections were done inadvertently.
The media, both popular and technical, have been discussing severe threats like the Stuxnet worm, supposedly built as a cyberweapon by the United States and Israel to attack process plants in Iran. Another severely problematic vulnerability is Aurora, which was used in a test to destroy a piece of electric grid equipment.
Marco Ivaldi, senior security advisor from @mediaservice.net, an Italian security researcher and "hacker," says, "I think there is still a long way to go to reach a proper level of security in either the process industries or the electric utility space." Ivaldi went on to discuss several extremely dangerous vulnerabilities in commonly used industrial control system components from a variety of vendors.
In addition, in July, Siemens self-reported three vulnerabilities in WinCC and Simatic Step 7.
And so it continues.
Almost every government has taken steps to try to do something, anything, to improve the security posture of industrial control systems because they are part of the critical infrastructure of a modern economy. It remains to be seen if any of those steps have actually done so.
Are We Any More Secure This Year Than Last?
"This," says Eric Byres, CTO of the Tofino Security division of Belden Inc., "is a tough question, because what we have happening is an arms race between the good guys and the bad guys. Both the vendors and the end users are slowly becoming security aware and are starting to provide and deploy good security technologies and practices. Unfortunately, the bad guys are also becoming more aware of the opportunities to attack industrial systems—we can thank Stuxnet for that—and at the same time, the tools available for security attacks on ICS and SCADA systems are rapidly improving."
After interviewing more than a dozen industrial control system security professionals, including end users, security researchers, suppliers and experts in just about every industrial vertical, it is clear that the very best answer to the are-we-more-secure question is a resounding, "maybe, maybe not."
"So the answer to the question," Byres says, "is that many ICS and SCADA systems are more secure than they were last year, but the bad guys are better equipped."
John Cusimano, director of security solutions for exida and director of the Repository of Industrial Security Incidents (RISI), says, "Overall the security posture of most control systems is still fairly weak. It varies significantly by industry, though. Major oil and gas and chemical companies are actually doing fairly well."
What does "fairly well" mean? "These companies started working on this topic pre-Stuxnet and have bolstered their programs since," Cusimano continues. "They generally have, or are working on, written policies and procedures specifically for ICS security; have firewalled their ICS networks from their business networks; and have conducted internal security assessments of their critical facilities."
Yet Dr. Erik Johansson, senior affiliated researcher at the Dept. of Industrial Information and Control Systems, Royal Institute of Technology (KTH) in Sweden, when asked the same question said, "I do hope so."
Joe Weiss, principal at Applied Control Solutions, and ControlGlobal.com's security blogger says, "ICS systems in process industries are NOT secure [his emphasis]. The degree of insecurity ranges, depending on the end user. It is not clear if they are more secure than last year, as there are now more identified vulnerabilities and more people aware of ICS cyber vulnerabilities. ICSs in the electric industry are no more secure than in any other industry. In fact, an argument can be made that the NERC CIP process with all its exclusions has made the electric utilities less secure than other industries."
Marcelo Branquinho, executive director of TIsafe, a security consultancy in Brazil, who has more than 15 years' experience in ICS and SCADA systems, piles on. "No, they aren't secure at all." But he goes on, "In Brazil, things are becoming more secure now due to some new government regulations and new government publications such as the 'blue book,' the Guia de Referência para a Segurança das Infraestruturas Críticas da Informação (Reference Guide for the Security of Critical Information Infrastructures), a guide for security in ICS that government corporations are starting to follow."
David Mattes, a former end user from the Boeing Co. and now founder of Asguard Networks, says, "I don't have direct experience with the process industries, but I've been paying a lot of attention to the various voices speaking out about ICS security in the different industrial sectors. From what I've heard, not much is being implemented by way of security solutions, but a lot more connectivity is being added, and torrents of vulnerabilities are being disclosed. The sum of the parts then is that process industries are not, in general, secure, and they are less secure than they were last year."
Richard Guida, who retired in 2011 as vice president, worldwide information security at Johnson & Johnson and is now a part-time consultant in enterprise security, says that control systems are "less secure, because more PLCs and SCADA systems are being put on internal networks. Hence, they may become accessible over the Internet. So, while the vulnerabilities have not changed, the threats are much worse, given the higher exposure. The risks are much higher now and getting worse every year as more systems get exposed. The attack surface is growing far more quickly than any efforts at securing the systems."
Who Is Attacking? How Do We Defend?
Byres says that there are "Two kinds—dumb mistakes and well-designed advanced persistent threats (APTs). I still see a lot of down time from issues that are caused by simple mistakes—the infected laptop on the plant floor, the consultant connecting in remotely from an insecure home computer, and so on. These are expensive and obvious."
Byres goes on, "Now some people claim that APTs are just marketing hype, but Shamoon, Flame, Stuxnet, Nitro, Night Dragon and Duqu are all good examples of APTs. Trying to wish away APTs as hype is a clear case of sticking one's head in the sand."
Clint Bodungen, security analyst with Amor Group LLC, says, "Operators should defend against any attack that has a relatively significant chance of success and impact to operations where the consequence is greater than what the operator is willing to accept."
Bodungen goes on, "What does that translate to in terms of attack types? Well that could be different between operators. Operations with Windows machines on the process control network, especially those that allow USB media, should be concerned with APTs such as viruses and worms more so than some 'überhacker' cracking through layers of enterprise security and the DMZ to finally get through to the production network."
Ultimately, and more realistically, Bodungen believes that it is much more likely that a cyber breach will occur as a result of an ill-trained user or poor security procedures rather than some sophisticated targeted attack.
"That being said," he continues, "I don't think there are too many hackers that would say, 'I want to attack this SCADA network, but I hope it's a real challenge with a risk of being caught.' Therefore, any common 'low-hanging fruit' could also be threat. Such threats would be any published vulnerability with a known and working exploit in circulation, especially if it has been released as a Metasploit exploit."
Metasploit is open-source security software developed in collaboration between the open-source community and Rapid7. It helps security and IT professionals identify security issues, verify vulnerability mitigations and manage expert-driven security assessments.
Ivaldi says, "Simply put, we should defend against all attacks. To be able to do so, I believe we should shift our focus from threats to operations. Assuming you know what threats exist, when they may hit, how they will come and where they will go is something reserved for risk analysis, which usually leads to variable and likely biased results. Instead the attack surface of and around a target should be thoroughly evaluated in order to understand where the threats, any threats, can attack if they do attack."
Cusimano points out, "A lot of effort is going into protecting the control system from the business network (or vice versa, depending on your perspective). This definitely makes sense. But although defending that boundary is a significant challenge, the biggest threat to control systems is all the 'side channels,' meaning the other ways that digital information can get into the control system besides through the business network. In general, there is a real false sense of security out there being attributed to the firewall between business and control. First of all, most of them [the firewalls] are misconfigured. They are put in, and then everyone requests ports to be opened so their applications can work, and after a while, they look like Swiss cheese. Second of all, they only represent protection of one path into the control system. USB sticks, maintenance laptops, CD/DVDs, remote access, modems, wireless access points—all represent just a few of the many ways a control system can be compromised."
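The "Swiss cheese" rule creep Cusimano describes can be caught with a periodic rule audit. A minimal sketch, assuming a hypothetical exported rule list (the rule format and rule names below are illustrative, not any vendor's):

```python
# Hypothetical sketch: flag overly permissive rules in an exported
# firewall rule list. The dict format and rule names are assumptions.

def audit_rules(rules):
    """Return allow-rules that accept any source or any destination port."""
    findings = []
    for rule in rules:
        if rule["action"] != "allow":
            continue
        if rule["src"] == "any" or rule["dst_port"] == "any":
            findings.append(rule)
    return findings

rules = [
    {"name": "historian",  "action": "allow", "src": "10.1.0.5", "dst_port": "1433"},
    {"name": "vendor-vpn", "action": "allow", "src": "any",      "dst_port": "any"},
    {"name": "default",    "action": "deny",  "src": "any",      "dst_port": "any"},
]

for r in audit_rules(rules):
    print("overly broad rule:", r["name"])  # flags only vendor-vpn
```

Run against each quarter's rule export, a report like this makes the slow accumulation of "just open this port" requests visible before the firewall turns into Swiss cheese.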
Mike Baldi, security architect for Honeywell Process Solutions, agrees, "There is no easy answer for this. Systems have to be protected from the intentional external attack and from the intentional or accidental insider attack. Locking down USB devices and CD/DVD readers significantly improves the security of the system. Using defense-in-depth, least-privilege-required and separation of duties strategies will greatly reduce the attack surface. Once a system is installed securely, it must be monitored continuously for indications of non-normal events that could signal a cyber incident against the system."
From his perspective in enterprise security and government, Guida says, "I am honestly less worried about another country attacking us, with the exception of North Korea, than I am about the possibility of some miscreant, like a spin-off of 'Anonymous,' just deciding to screw with peoples' lives and bring down some infrastructure 'just for the hell of it.' Unlike a country-level attack, a miscreant attacking really is more likely to be just hacking/intrusion over the Internet. A country-level attack could include physical break-in or sophisticated social engineering or traditional spy-level stuff. If systems are not exposed over the Internet or exposed within the company's network so that a successful attack on that network could leapfrog to systems attached to it, that would greatly reduce the attack surface to miscreant attack."
Is Security Another Y2K Fizzle?
There are many who believe that because nothing really bad happened with Y2K, nothing would have happened, and the whole exercise was a farce. Many of those same people, usually senior corporate leaders, appear to believe that cybersecurity in industrial control systems will turn out to be a similar fizzle.
Byres says, "Some people think it was a big waste of money because nothing fell apart on New Year's Eve 1999. But one reason that nothing went wrong was that people really did their homework to detect the Y2K issues up-front. So security could be like Y2K in the fact that if we do a good job, then people will say we need not have bothered because nothing went wrong."
The prevailing opinion from ICS security practitioners is that it's not like Y2K.
"Y2K was a one-time clock issue that had a very specific fix," says Weiss. "ICS security cannot be fixed with any 'silver bullet.' ICS security issues are real and have had devastating consequences to date. A nation-state targeted attack against the electric systems, natural gas pipelines and so forth could be devastating to this country."
ICS Is Different, Way Different
Guida explains ICS IT: "In an enterprise, you have an infrastructure, endpoint devices and humans with all their foibles. In an environment with PLCs and SCADA systems, you have an infrastructure, endpoint devices, humans and embedded systems. So the complexity of the latter is worse—and arguably much worse—because you may not even be aware of where your embedded systems are, what vulnerabilities they possess and how they are exposed."
Mattes explains, "There's a different prioritization between the two environments [ICS and enterprise IT]: The classic availability, integrity, confidentiality (AIC) versus the CIA perspective. ICS security is much more than standard systems, software and processes. Almost everything about ICS security tends to go against IT standards and processes. ICS environments are where enterprise IT was 10 to 15 years ago. We're talking about a high ratio of labor hours to administer and support systems, a lack of management tools, a lack of security capabilities, standards and products, with dynamic networking still in its infancy, and a lack of auditing and compliance capabilities, all with poorly designed software and interfaces. IT personnel need to understand the environment and accept that these environments are often the revenue-generation core of the enterprise."
A SCADA tech for a municipality in Ontario, Canada (who asked to be anonymous) put it pithily: "Explaining ICS IT to an enterprise IT pro is a waste of precious time. If you feel so inclined, see professional help."
For those still inclined, he laid out some rules:
- You cannot interrupt the process. No. Never.
- Do not touch. Ever.
- Here is a nickel. Don't spend it all in one place.
- Protect the process for the next 40 years against any and every eventuality including, but not limited to, the operator running the process, yourself, the equipment which runs the process, and any and all unexpected, uncommunicated, unplanned, last-minute or instantaneous changes to the process.
- Ensure that the documentation suite is clear, concise and understandable by a fourth grader and no more than one page long.
- Now hide all the documentation.
- Expect management to rotate every two years.
- Expect continuous staff rotation.
- Expect you have no time to test.
- Expect changes must be done on live systems on the fly without a safety net.
- Expect there are no spares.
"Explaining enterprise security to an ICS person is simpler," he said. "How would you feel if suddenly at midnight, while you are sleeping comfortably in your bed, a data spill occurs half-way around the world? In the blink of an eye, your life savings is transferred out of your account, all credit is maxed and all your non-liquid assets are transferred and mortgaged to the hilt."
What Are the Metrics for Security?
Jeff Potter, director of security architecture for Emerson Process Management, says, "Security metrics are continuing to prove extremely hard to create, so a pure ROI calculation—I've spent N, and my security is X better—is not available. I'm not sure when it will be—we'd need many more incidents occurring to get a statistically valid sampling, and I hope that situation never occurs!"
Bodungen breaks it down very clearly: "Obviously, if you don't have a breach, then one way or another you have enough security—whether that means you've covered all of your bases, or you happen to have just the right mitigation for the specific attack being performed," he says. "Unfortunately, it's difficult to know the sophistication of the attacker and what attacks he will use, and covering all of your bases with the maximum level of comprehensive security is very resource-intensive. Therefore, at the most basic level, it boils down to this: You have to know what your vulnerabilities are. For each vulnerability, you have to have a good idea of the likelihood that it could be exploited. You have to have a good idea of the consequences or impact to your business should that exploit cause a breach. You must know the estimated cost to mitigate or reduce the risk. Implement mitigation where the consequence exceeds the cost to mitigate, beginning with those vulnerabilities that would have the highest impact to operations and with the greatest likelihood of exploitation."
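Bodungen's procedure reduces to a simple triage: estimate each vulnerability's expected consequence, mitigate where that exceeds the mitigation cost, and work highest impact first. A minimal sketch of that logic, where every figure and vulnerability name is an illustrative assumption, not real data:

```python
# Sketch of the prioritization Bodungen describes: mitigate where the
# expected consequence exceeds the mitigation cost, highest risk first.
# All likelihoods, dollar figures and vulnerability names are invented.

vulns = [
    # (name, likelihood 0-1, consequence $, mitigation cost $)
    ("unpatched HMI",       0.30, 2_000_000,  50_000),
    ("open vendor modem",   0.10, 5_000_000,  10_000),
    ("USB ports enabled",   0.50,   500_000,   5_000),
    ("legacy PLC firmware", 0.05,   200_000, 300_000),
]

def plan(vulns):
    # expected consequence = likelihood * consequence in dollars
    worth_fixing = [(n, l * c, cost) for n, l, c, cost in vulns if l * c > cost]
    # address the highest expected impact first
    return sorted(worth_fixing, key=lambda v: v[1], reverse=True)

for name, risk, cost in plan(vulns):
    print(f"{name}: expected loss ${risk:,.0f} > mitigation ${cost:,.0f}")
```

Note how the legacy PLC firmware drops out of the plan: its expected loss is lower than the cost to fix it, which is exactly the "consequence exceeds the cost to mitigate" cutoff Bodungen states.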
But how do you measure the result of security improvements? Potter says, "The ISA99 Task Group working on this issue has struggled to create a useful product, and although there are some narrow metrics that have value, like patch frequency and training, I've not seen anything that is remotely comprehensive."
Our anonymous operator from Ontario, Canada, has a different point of view. "For high-severity incidents," he says, "security is always a subset of safety. Therefore, safety metrics apply. That is, you track near misses; track found issues over time; estimate remaining latent issues; and assume there are always latent issues."
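One common way to "estimate remaining latent issues," as the operator suggests, is a capture-recapture (Lincoln-Petersen) estimate from two independent security reviews; this particular estimator is our assumption, not his stated method, and the numbers are illustrative:

```python
# Capture-recapture sketch for estimating latent issues: two independent
# reviews each find some issues; the overlap between their findings
# suggests how many issues exist in total. Figures are illustrative.

def estimate_latent(found_a, found_b, found_both):
    """Lincoln-Petersen estimate of total and still-undiscovered issues."""
    if found_both == 0:
        raise ValueError("no overlap between reviews: estimate is unbounded")
    total = found_a * found_b / found_both
    unique_found = found_a + found_b - found_both
    return total, total - unique_found

total, latent = estimate_latent(found_a=12, found_b=10, found_both=6)
print(f"estimated total issues: {total:.0f}, still latent: {latent:.0f}")
```

The low-overlap case is the worrying one: if two reviews barely find the same issues, the estimated latent population balloons, which matches the operator's rule to "assume there are always latent issues."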
"The security metric," says William Miller, president of systems integrator MaCT, should be considered from a perspective that with contemporary approaches using anti-virus and firewalls, the endpoints will be vulnerable to cyber attack. If you look at the problem, it can be seen that today's TCP-enabled systems are insufficient."
Branquinho explains TIsafe's approach. "We normally follow these steps," he says. "Conduct a quantitative risk analysis of the plant to be protected. With this analysis, we can have a precise vision of the value of the plant's assets and of the total risk the plant faces in case of security incidents. Based on the risk analysis, we sum the dollar values of the risks classified as critical. We use this value as the basis for calculating the investment. We named it risk analysis value (RAV). Then we calculate 5% to 10% of the RAV as the acceptable range of investment in automation security. We developed this methodology just because there wasn't any metric available for this."
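The RAV arithmetic Branquinho describes is straightforward: sum the dollar values of the critical risks, then take 5% to 10% of that sum as the budget range. A minimal sketch with illustrative risk figures (the tuple format is our assumption):

```python
# Sketch of the RAV budgeting Branquinho describes: sum the dollar
# values of critical risks, then take 5-10% as the investment range.
# The risk list and its dollar values are illustrative assumptions.

def rav_budget(risks, low=0.05, high=0.10):
    """risks: list of (severity, dollar_value) tuples."""
    rav = sum(value for severity, value in risks if severity == "critical")
    return rav, rav * low, rav * high

risks = [
    ("critical", 4_000_000),
    ("critical", 1_500_000),
    ("moderate",   800_000),
]

rav, lo, hi = rav_budget(risks)
print(f"RAV = ${rav:,}; invest ${lo:,.0f} to ${hi:,.0f}")
```

Only the critical-classified risks enter the RAV here, mirroring Branquinho's "risks classified as critical," so the moderate risk contributes nothing to the budget base.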
Guida concurs. "To paraphrase a famous person—I think it was Benjamin Disraeli—there are lies, damned lies, statistics and worst of all, ROI calculations related to security. It is simply not possible to come up with reasoned ROI calculations because every situation is unique, and there are no reliable data like those traditionally used for actuarial studies."
How Much Security Is Too Much?
Obviously, there is some contention between experts on how much security a plant should have. If security gets in the way of operating the plant, that may be too much.
Jeff Potter says, "How much security is enough is dependent upon a risk assessment, including the consequences (monetary, HSE, other) associated with a security incident. One can obviously put a dollar value to a production outage, but this is less clear-cut when one is dealing with an environmental release, where there are reputational issues as well as monetary fines, or injury or death of personnel."
Eric Byres says, "I think we make this too complicated. Start by looking at the potential consequences of a deliberate cyber attack. This is completely company/operation specific—if you operate an automatic car wash, then the consequences you face are low, and the money you spend should be low as well. However, if you operate an off-shore oil platform, the consequences are significant, and your security investment needs to be significant as well."
He adds, "You can also look at the probability of an attack, but that is hard and changing fast and for the worse. Therefore I would simplify things and assume that in the next few years, the probability for cyber events will approach that for physical events, such as theft, vandalism, sabotage or accidents. Of course, this is also company/operation specific—the carwash will not have the same enemies as the oil company, but for a specific company the adversaries will likely be similar."
"Now assuming your company has figured out a reasonable risk benefit equation for physical events, and assuming that the consequences and the adversaries will be roughly the same over the next few years, then the cybersecurity spend should be the same, too," Byres says. "Maybe not dollar for dollar, but as an executive at an oil company once said to me, 'If we spend $50 million for fire suppression on our offshore platforms, and we spend $50,000 for cybersecurity on those same platforms, and both types of incidents have the same consequences, then we have a problem. Either we are spending too much on fire suppression or too little for cybersecurity."