Read our April 2009 story "A Distinction with a Difference in SCADA Security"
In preparation for our article, “A Distinction with a Difference in Functional Security,” we consulted experts on the ground—security consultants, regulatory experts, vendors, systems integrators and end users. We polled them via email, Twitter, a blog post and the SCADASEC email discussion list. Below are the complete answers our respondents sent.
Joe Weiss, founder of Applied Control Solutions and author of ControlGlobal.com’s “Unfettered” blog, says:
Ideally, NERC CIP security compliance and securing assets should be complementary. NERC CIP compliance means you have met NERC’s requirements. Many people have assumed NERC requirements lead to secure assets―they do not! What they lead to is a programmatic approach that may or may not be relevant to actually securing assets. For example, you can be NERC CIP-compliant while excluding telecom, all distribution, non-routable protocols (even though they may make up 75%-80% of the utility’s control system communications), and even all generation and substations if your “risk assessment” defines them not to be “critical.” An example of this shortcoming was exposed two years ago at an ISA Expo session. A NERC representative was asked if it would be possible to be NERC CIP-compliant and still be fined for not meeting NERC reliability requirements. Unbelievably, the answer was YES. If the utility implemented security policies, whether they were appropriate or not, it would be deemed NERC CIP-compliant for implementing policies. However, if those same policies led to failures affecting grid reliability, the utility could be fined (up to $1 million/day/event) for having significant reliability vulnerabilities, even though it was deemed NERC CIP-compliant! This argument is not hypothetical. Inappropriate policies, procedures and/or testing have led to numerous control system cyber incidents, some of which had very significant impacts.
On the other hand, securing assets means you have determined what you actually have installed and then reduced cyber vulnerabilities by implementing appropriate policies, procedures, technologies, architectures, etc.
An interesting follow-on question is how much security is enough security? Currently, there is no set answer. Rather, it is a risk answer. Risk is classically defined as frequency multiplied by consequence. For control systems, there is no statistical basis for cyber security frequency. Consequently, it is prudent to assume a probability of 1―the event will happen. The amount of risk you are willing to accept determines what you will be willing to pay to reduce the risk to an acceptable level. Part of the risk has to include the potential impact on control system performance and facility reliability of implementing security, as security was generally not a design feature.
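Weiss's risk framing above can be sketched as a quick calculation. This is a minimal illustration only; the dollar figure and the 0.1 frequency guess are hypothetical and not from the article:

```python
# Classic risk formula cited by Weiss: risk = frequency x consequence.
# For control systems he argues there is no statistical basis for
# estimating frequency, so the prudent assumption is probability = 1
# (the event will happen), which makes risk equal to the consequence itself.
# All numbers below are made up for illustration.

def annualized_risk(frequency_per_year: float, consequence_dollars: float) -> float:
    """Expected loss per year under the classic frequency-times-consequence model."""
    return frequency_per_year * consequence_dollars

# IT-style estimate built on a guessed frequency (no real data supports the guess):
print(annualized_risk(0.1, 5_000_000))   # 500000.0

# Weiss's prudent control-system assumption: frequency = 1, risk = full consequence:
print(annualized_risk(1.0, 5_000_000))   # 5000000.0
```

The gap between the two outputs is the point: an unsupported frequency estimate can make the same consequence look an order of magnitude cheaper to accept.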
Jesus Oquendo, Chief Information Security Architect, E-Fensive Security Strategies, says:
I come from the technology/security arena and attempted to answer these questions as if I had to sit with my CEO.
How much security do you need to be really secure? This is a tricky question. Security will always differ across companies, even for companies in the same field. To be effective, security should begin with a top-down approach from a fifty-thousand-foot view. Because companies are tasked with generating revenue, security has been considered a losing business deal by managers who don't grasp the entire picture.
Coming from the IT scope, the best mechanism for keeping a secure posture would be a properly governed information security architecture. However, this is a long process which involves high-level management and works its way from the top down. The full explanation would fill a book in itself, so to streamline it I offer a scenario: imagine asking for money to purchase a product which, say, "monitored" all the devices in your shop. Most times that device would be 1) overkill and 2) costly, not to mention you'd need the personnel to operate, administer and manage it.
In a properly governed architecture, one can isolate the specific equipment which needs monitoring and implement it there. In doing so, it turns out that it may not take as big, cumbersome and expensive a product. So how much security you need will always depend on what you are trying to protect. There can never be a magic answer to this question. There will always be a cost, however. There are no magic numbers, and there is no metrics methodology one can use to arrive at them. Aside from this, too much security can be quite counterproductive.
Now to the question of what's the difference between "compliance" and "security":
Compliance is an often misunderstood term so let's have a quick view of the definition: Compliance is either a state of being in accordance with established guidelines, specifications or legislation or the process of becoming so.
With this said, most understand the term, but many wrongly treat compliance and security as interchangeable, often settling for a low baseline that meets compliance requirements but fails miserably in the security arena. This is where guidelines and standards both get lost in translation.
Guidelines are merely hints and suggestions; standards are explicit. But guidelines and standards for whom, exactly? Most were written from a very narrow point of view and then handed to a broad range of businesses. What works in one environment may not work in another. Many security professionals get this concept wrong. They set their expectations of security based on "guidelines" that don't necessarily apply to their business. They'll implement "standards," a baseline set of someone else's standards, without taking a real-world view of their own needs, and offer this as evidence that they've "secured" something, when all they've done is meet a minimum expectation of security―just enough to meet compliance.