This article was printed in CONTROL's April 2009 edition.
By Nancy Bartels and Walt Boyes
When you’re talking about cybersecurity in the process industries, there are only two issues that matter. The first is how much security you need to be really secure. The second isn’t all that obvious, but in many ways defines the first—and it’s one people aren’t thinking about. What’s the difference between “compliance” and “security”?
To find out, we consulted experts on the ground—security consultants, regulations experts, vendors, systems integrators and end users. Not surprisingly, their answers covered a lot of territory, but zeroed in on a central theme: security and compliance are not the same thing. And, while it’s tempting to think that compliance with existing regulations is “good enough” (and the most cost-effective) security, that’s a strategy that can come back to bite you—hard.
Bob Radvanovsky, owner of the SCADASEC mailing list, or listserv, begins the discussion by parsing the classical definitions of compliance and security. He points out they’re not synonymous and “practically contradict one another.”
“Clearly, the definition of ‘security’ does not constitute a method by which you are ‘complying with’ something. Consequently, being ‘compliant’ does not guarantee that something is ‘secure.’ One deals with compliance based on levels of coercion, meaning that someone or something made you [take certain actions]. Whereas, ‘security’ represents a state of mind, meaning that one feels safe or secure only if certain preventative and/or reactive efforts are implemented.”
Dan DesRuisseaux, manager, Ethernet Marketing Group, Schneider Electric, adds, “Compliance doesn’t assure security. A company can be compliant with internally or externally generated security regulations, but may still be vulnerable to attack. Being compliant with any one standard does not guarantee security.”
Joe Weiss, founder of Applied Control Solutions and author of ControlGlobal.com’s “Unfettered” blog, adds, “Ideally, North American Electric Reliability Corp.’s Critical Infrastructure Protection (NERC CIP) security compliance and securing assets should be complementary. NERC CIP compliance means you’ve met NERC’s requirements. Many people assume NERC requirements lead to secure assets―but they do not! What they lead to is a programmatic approach that may or may not be relevant to actually securing assets.”
Marcus Sachs, executive director, government affairs-national security, Verizon, puts it more bluntly. “Compliance equals auditors are happy. Security equals investors and customers are happy,” he says. “We tried to create a ‘culture of security’ many years ago, but failed. Instead, we created a ‘culture of compliance,’ and it led to a lot of problems. We need to get out of the checkbox mindset and back to thinking like security experts when examining information systems, regardless of whether they’re plant or enterprise IT systems.”
How Close to the Edge?
“How much security you need really depends on how much risk you’re willing to accept,” says John Cusimano, director of security services at exida. “With the understanding that one can never completely eliminate risk, corporations need to quantify their level of tolerable risk, and then design their systems to meet or exceed that level. Compliance measures conformity to a standard or regulation. The relationship between the two is that one can establish a target security level in the form of a tolerable risk level, and measure whether they’re complying with that target.”
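Cusimano’s point—set a tolerable risk level, then measure whether you meet it—can be sketched as simple arithmetic. The following is a minimal illustration, not anything from exida; the threat names and 1–5 likelihood/impact scores are made-up assumptions.

```python
# Illustrative sketch of a tolerable-risk target: score each threat as
# likelihood x impact and flag anything that exceeds the target level.
# All threats and scores below are hypothetical examples.

TOLERABLE_RISK = 6  # target: likelihood x impact must stay at or below this

threats = {
    # name: (likelihood 1-5, impact 1-5)
    "unpatched HMI workstation": (4, 3),
    "shared operator password": (3, 2),
    "open modem line to vendor": (2, 5),
}

def risk(likelihood, impact):
    return likelihood * impact

def needs_mitigation(threats, tolerance):
    """Return the threats whose risk score exceeds the tolerable level."""
    return {name: risk(l, i) for name, (l, i) in threats.items()
            if risk(l, i) > tolerance}

over_target = needs_mitigation(threats, TOLERABLE_RISK)
```

In this toy scoring, the unpatched workstation (12) and the vendor modem line (10) exceed the target and demand mitigation, while the password issue (6) sits exactly at the tolerable level—which is the “meet or exceed” comparison Cusimano describes.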
Todd Stauffer, PCS 7 marketing manager at Siemens Energy & Automation, concurs. “Security is a relative term. There’s almost no way to provide 100% assurance that a system is secure today and will be secure in the future. To maximize security posture, owner/operators should implement a defense-in-depth security concept. This concept leverages technology, such as firewalls, access control, virus scanners, software patch management, physical protection and personnel operating procedures to create a layered defense. These measures must be continually updated and augmented to ensure that newly discovered security vulnerabilities are mitigated.”
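The arithmetic behind Stauffer’s layered defense is worth making explicit: if layers fail independently, an attacker must slip past every one, so the overall breach probability is the product of the per-layer bypass probabilities. The figures below are made-up illustrations, not vendor numbers.

```python
# Hedged sketch of the defense-in-depth payoff: assuming independent
# layers, multiply each layer's (hypothetical) bypass probability to
# get the chance an attack penetrates the whole stack.

from functools import reduce

def breach_probability(bypass_probs):
    """Probability an attacker slips past every layer in sequence."""
    return reduce(lambda acc, p: acc * p, bypass_probs, 1.0)

layers = {
    "firewall": 0.10,
    "access control": 0.20,
    "virus scanner": 0.30,
    "patch management": 0.25,
}

stacked = breach_probability(layers.values())   # all four layers together
best_single = min(layers.values())              # strongest layer on its own
```

With these illustrative numbers, the best single layer still lets one attack in ten through, while the four-layer stack cuts the breach probability to 0.0015—three orders of magnitude better, which is the argument for never relying on any one measure.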
How Much Security Is Enough?
Jake Brodsky of Washington Suburban Sanitary Commission (WSSC) describes the end user’s perspective. “How secure is secure enough? That’s really the foundation question. It’s like asking how safe should our cars be? We can include all sorts of measures in them, ranging from anti-lock brakes, airbags, seat-belts, crumple zones, safety glass, traction control, etc. However, even this isn’t going to help if the driver is reckless. Control systems are like that. The biggest hurdle [to good cybersecurity] is education: ensuring that people understand what they’re doing when they design these things. It is also a matter of teaching people to operate securely.”
Brodsky adds, “As an interim step, we have to mandate a compliance-based approach, with the caveat that this alone may not prevent an attack. In the long run, a compliance-based approach is only a temporary measure until people combine enough experience and knowledge to know better.”
Bob Huba, Emerson Process Management’s cybersecurity expert, says the reply to the “how much security” question should be another question—secure from what? He says, “Without a vulnerability assessment to know what threats you need to protect against, you can’t know if you have sufficient protections in place to mitigate these threats. Yes, you can take a worst-case vulnerability scenario, but then you run the risk of wasting time and money on too much protection and potentially putting your security emphasis on the wrong solutions that don’t make you more secure.”
Ralph Langner, of Langner Communications, a European security consultancy, says, “An automated industrial process is insecure if foreseeable failures or manipulation attempts of automation equipment, SCADA installations or network devices can cause significant or unknown damage. In real life, it’s often easy to determine required mitigation controls to get away from insecure. However, there is a difference between health, safety and environment (HSE) risks and risks relating only to money. For HSE, the budget for mitigation is largely determined by ethics, legislation and compliance, whereas for monetary consequences, budget decisions have to match anticipated monetary loss. Only some of the potential negative outcomes from security HSE events are regulated. For other risks, there is no compliance, but there are still security issues that need to be mitigated. For example, there is no need to patch systems or to install firewalls to be compliant. Companies do so anyway to reduce risk.”
Creating a “Culture of Security”
Marcus Sachs says we have failed to create a culture of security. Huba says we have to succeed. “Creating a secure system goes beyond just complying with a specification,” adds Huba. “It takes creating a ‘culture of security,’ where security is treated as everybody’s job, and they understand the consequences of dangerous behaviors. In my experience, compliance deals more with meeting the minimum, and trying to see what can be ‘gotten away with’ when the auditor is not looking. Complying is something management puts in place and audits―not something an employee does or participates in. Compliance happens around employees, not because of them. You will never create a secure industrial control system without employee participation.”
Steve Carson of Multitrode agrees. “The security focus is on technology because it is topical. However, the clearest security weaknesses are always in people’s practices―passwords written in books next to computers, no locks on gates, or when you can just follow a random contractor in through the security gate after he’s been given access.” Multitrode manufactures lift station controls and monitoring devices for water and wastewater utilities.
Doing, Not Saying
So, what should end users be doing to both keep their plants secure and the auditors off their backs? Kevin Staggs, engineering fellow and global security architect, Honeywell Process Solutions, gives a standard formula for determining security. “At a minimum, a plant should isolate the process control network (PCN) from the corporate network. This isolation should be done using a firewall configured to deny all traffic except for connections required between specific PCN nodes and corporate network nodes. The PCN should not be able to reach the Internet directly. A good configuration would also include a DMZ (demilitarized zone) between the corporate network and PCN. The data servers for moving information between the PCN and corporate networks would be located in the DMZ. A best practice for determining how much cybersecurity is required is to perform a PCN cybersecurity assessment of your system.”
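Staggs’ formula—default-deny between zones, explicit allow rules only for required flows, no direct PCN path to the Internet—can be expressed as a tiny policy model. This is my own illustrative construction, not Honeywell product logic; the zone and service names are hypothetical.

```python
# Minimal model of the zoned architecture Staggs describes: corporate
# network, DMZ, and process control network (PCN), with a default-deny
# firewall that permits only explicitly listed flows. The DMZ hosts
# the data servers that bridge the PCN and corporate networks.

ALLOW_RULES = {
    # (source zone, destination zone, service) tuples explicitly permitted
    ("pcn", "dmz", "historian"),        # PCN pushes process data to a DMZ server
    ("corporate", "dmz", "historian"),  # corporate reads that data from the DMZ
}

def is_allowed(src, dst, service):
    """Default-deny firewall: a flow passes only if explicitly listed."""
    return (src, dst, service) in ALLOW_RULES
```

Note what falls out of the default-deny stance without any extra rules: `is_allowed("pcn", "internet", "http")` and `is_allowed("corporate", "pcn", "rdp")` are both denied, because no rule grants the PCN Internet access or a direct corporate-to-PCN connection—only the brokered paths through the DMZ exist.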
In the end, getting the secure systems you need and want at a price the bean counters are willing to pay becomes a balancing act.
Paul Francis, Multitrode’s chief technical officer, concludes, “To answer the question ‘How much security is enough?’ implies that there’s a one-size-fits-all solution for every situation, or that you even necessarily know when you’re done, as opposed to it [security] being a continual journey. Just because your system ‘complies’ with a particular directive or set of guidelines does not imply that it’s necessarily inherently secure. Best practice guidance, regulations and compliance documentation all provide critical input into the process, but to achieve adequate security in depth (a single layer is not usually enough) means you have to assess the specifics relating to items such as your geography, communications infrastructure, employee policies and control, technical architecture, protocol support, physical access, training, criticality of data and systems, government or other regulatory body requirements, risk profile and much more. Regular auditing and testing of that framework will then help evolve the model further.”
Walt Boyes is Control’s editor in chief. Nancy Bartels is Control’s managing editor.
We developed sources for this story by posting questions on the SoundOff! Blog, Twitter and the SCADA listserv and got far more responses than we had room for here. For the best in-depth comments from our respondents, and other security resources, go to www.controlglobal.com/0904_SCADA.
Ernie Rakaczky, principal security architect, IPS, says it all comes down to understanding the following definitions and then taking the required actions:
- Exploit—A vulnerability that has been triggered by a threat
- Risk—A possible event that could cause a loss
- Threat—A method of triggering risk
- Vulnerability—A weakness in a target that can potentially be exploited
- Assurance—The level of guarantee that a security system will behave as expected
- Countermeasure—A way to stop a threat from triggering risk
- Defense-in-depth—Never relying on a single security measure, but on multiple layers of security measures
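The relationships among Rakaczky’s terms can be wired together in a toy model—my own construction, not his: a threat triggers a vulnerability to produce an exploit unless some countermeasure covers it, and defense-in-depth stacks several countermeasures. The vulnerability names below are hypothetical.

```python
# Toy model of the glossary: an attack exploits only those
# vulnerabilities that no countermeasure layer covers.

def attack(threat_active, vulnerabilities, countermeasure_layers):
    """Return the vulnerabilities that get exploited.

    A vulnerability becomes an exploit only if a threat is active and
    no layer of countermeasures addresses it (defense-in-depth).
    """
    if not threat_active:
        return set()  # risk remains, but nothing triggers it
    return {v for v in vulnerabilities
            if not any(v in layer for layer in countermeasure_layers)}

vulns = {"default password", "unpatched OS", "open USB ports"}
layers = [
    {"default password"},   # countermeasure layer: password policy
    {"unpatched OS"},       # countermeasure layer: patch management
]
residual = attack(True, vulns, layers)
```

Here the residual exploit is the uncovered weakness (open USB ports)—a compact way of restating the article’s thesis that assurance means knowing every vulnerability has a countermeasure, not just that a checklist was satisfied.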