What’s the Difference Between Security and Compliance? - The Long Answers

April 9, 2009
More on Protecting Your Assets, Security and Compliance

Read our April 2009 story "A Distinction with a Difference in SCADA Security"

In preparation for our article, “A Distinction with a Difference in Functional Security,” we consulted experts on the ground—security consultants, regulatory experts, vendors, systems integrators and end users. We polled them via email, Twitter, a blog post and the SCADASEC email discussion list. Below are the complete answers our respondents sent.

Joe Weiss, founder of Applied Control Solutions and author of ControlGlobal.com’s “Unfettered” blog, says:

Ideally, NERC CIP security compliance and securing assets should be complementary. NERC CIP compliance means you have met NERC’s requirements. Many people have assumed NERC requirements lead to secure assets―they do not! What they lead to is a programmatic approach that may or may not be relevant to actually securing assets. For example, you can be NERC CIP-compliant while excluding telecom, all distribution, non-routable protocols (even though they may make up 75%-80% of the utility’s control system communications), and even all generation and substations if your “risk assessment” defines them not to be “critical.” An example of this shortcoming was exposed two years ago at an ISA Expo session. A NERC representative was asked if it would be possible to be NERC CIP-compliant and still be fined for not meeting NERC reliability requirements. Unbelievably, the answer was YES. If the utility implemented security policies, whether they were appropriate or not, it would be deemed NERC CIP-compliant for implementing policies. However, if those same policies lead to failures affecting grid reliability, the utility could be fined (up to $1 million/day/event) for having significant reliability vulnerabilities, even though it was deemed NERC CIP-compliant! This argument is not hypothetical. Inappropriate policies, procedures and/or testing have led to numerous control system cyber incidents, some of which had very significant impacts.

On the other hand, securing assets means you have determined what you actually have installed and then reduced cyber vulnerabilities by implementing appropriate policies, procedures, technologies, architectures, etc.

An interesting follow-on question is: how much security is enough? Currently, there is no set answer. Rather, it is a risk answer. Risk is classically defined as frequency multiplied by consequence. For control systems, there is no statistical basis for the frequency of cyber security events. Consequently, it is prudent to assume a probability of 1―the event will happen. The amount of risk you are willing to accept determines what you will be willing to pay to reduce the risk to an acceptable level. Part of the risk has to include the potential impact of implementing security on control system performance and facility reliability, since security was generally not a design feature.
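As a rough illustration of that arithmetic, the short Python sketch below uses invented dollar figures and fixes the frequency at 1, per the assumption above; it is not a risk model, only the bookkeeping the argument implies.

# Risk = frequency x consequence. With no statistical basis for event frequency,
# assume a probability of 1 (the event will happen), so the exposure collapses to
# the full consequence. All dollar figures here are hypothetical.

frequency = 1.0                 # assume the event will happen
consequence = 5_000_000         # estimated cost of a serious control system cyber event ($)
exposure = frequency * consequence

acceptable_risk = 500_000       # the residual risk the business is willing to accept ($)
justified_spend = exposure - acceptable_risk   # upper bound on what mitigation is worth

print(f"exposure: ${exposure:,.0f}")
print(f"mitigation spending justified up to: ${justified_spend:,.0f}")

Once frequency is assumed to be 1, the question reduces to weighing consequence against what you are willing to pay to avoid it.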

Jesus Oquendo, Chief Information Security Architect, E-Fensive Security Strategies, says:

I come from the technology/security arena and attempted to answer these questions as if I had to sit with my CEO.

How much security do you need to be really secure? This is a tricky question. Security will always be different across companies, even for companies in the same field. To be effective, security should begin with a top-down approach from a fifty-thousand-foot view. Because companies are tasked with generating revenue, security has often been considered a losing business deal by managers who don't grasp the entire picture.

From the IT perspective, the best mechanism for keeping a secure posture would be a properly governed information security architecture. However, this is a long process that involves high-level management and works its way from the top down. The full explanation would fill a book in itself, so to streamline it I offer a scenario: imagine asking for money to purchase a product that, say, "monitored" all the devices in your shop. Most times that device would be 1) overkill and 2) costly, not to mention you'd need the personnel to operate, administer and manage it.

In a properly governed architecture, one can isolate the specific equipment that needs monitoring and implement monitoring there. In doing so, it may turn out that you don't need as big, cumbersome and expensive a product. So how much security you need will always depend on what you are trying to protect. There can never be a magic answer to this question. There will always be a cost, however. There are no magic numbers, and there is no metrics methodology one can use to arrive at those kinds of numbers. Aside from this, too much security can be quite counterproductive.

Now to the question of what's the difference between "compliance" and "security":

Compliance is an often misunderstood term so let's have a quick view of the definition: Compliance is either a state of being in accordance with established guidelines, specifications or legislation or the process of becoming so.

With this said, most understand the term, but many wrongly treat compliance and security as interchangeable, often settling for a low baseline that meets compliance requirements but fails miserably in the security arena. This is where guidelines and standards both get lost in translation.

Guidelines are merely hints and suggestions; standards are explicit. But guidelines and standards for whom, exactly? Most were written from a very narrow point of view and then handed to a broad range of businesses. What works in one environment may not work in another. Many security professionals get this concept wrong. They set their security expectations based on "guidelines" that don't necessarily apply to their business. They'll implement "standards," a baseline set of someone else's standards, without taking a real-world view of their own needs, and then offer this as evidence that they've "secured" something, when all they've done is met a minimum expectation of security―just enough to meet compliance.

It’s part of the security herding instinct: "Company A is doing it, therefore it must be the right thing for me!"

Mike Braun, CISSP CISA CISM ITIL-F, Senior Security Engineer, Verizon Business.

Regarding "compliance vs security": Compliance is doing the minimum necessary to meet an audit or regulatory compliance. Security is doing what is necessary, often within the compliance or audit structure, to reduce risk to an acceptable level defined by the business you’re in.

Jake Brodsky, Washington Suburban Sanitary Commission

How secure is secure enough?

That's really the foundational question. It's like asking how safe our cars should be. We can include all sorts of measures in them: anti-lock brakes, airbags, seat belts, crumple zones, safety glass, traction control and so on. Yet even this isn't going to help if the driver is reckless.

Control systems are like that. Currently we have very few prescriptive standards, and the application of the broad concepts borrowed from IT is no simple task. The biggest hurdle is education: Ensuring that people understand what they're doing when they design these things. It is also a matter of teaching people to operate these things securely.

As an interim step, we have to mandate a compliance-based approach, with the caveat that this alone may not save you from an attack. In other words, do this, even if you do not understand exactly why; and if you screw up, you could still get hurt. So study why you should be doing this.

In the long run, a compliance-based approach is only a temporary measure until people combine enough experience and knowledge to know better.

Ralph Langner, Langner Communications AG
Hamburg, Germany

I noticed your request for input on the ControlGlobal blog. Even though I work in Europe, I want to share some thoughts that you might find helpful.

How much security do you need to be really secure?

The answer certainly depends largely on your definition of security. I define the following as INSECURE: An automated industrial process is insecure if foreseeable failures or manipulation attempts of automation equipment, SCADA installations or network devices can cause damage beyond the insignificant, or can cause damage of unknown extent. In real life, it is often quite easy to determine required mitigation controls to get away from insecure. However, there is a difference between HSE [health, safety and environmental]  risks and risks relating "only" to money. For HSE, budget for mitigation is largely determined by ethics, legislation and compliance, whereas for monetary consequences, budget decisions have to match anticipated monetary loss.
What's the difference between compliance and security?

Simple, if you look at the last sentence above. Only part of the potential negative outcomes of security events (i.e., HSE) are regulated. Compliance requires a regulation that you can comply with. For other risks, there is no compliance, but there are still security issues that need to be mitigated. For example, there is no need to patch systems or to install firewalls in order to be compliant. Companies do so anyway in order to reduce risk.

Patrick Coyle
Chemical Facility Security News

Really Secure

There is no such thing as “really secure” or “absolutely secure”; it is a purely theoretical concept like infinity. By adding security measures you can get closer, but each successive improvement moves you a shorter distance towards the theoretical goal. Unlike the mathematical construct, however, the theoretical goal is periodically moved a random increment further away due to the advances made by the opposition—and that movement is seldom made with the knowledge of the home team.

Compliance versus Security

Since security is unattainable, and for all intents and purposes unmeasurable, organizations establish artificial standards that serve as a surrogate for security. When a component of that organization has met the current standards, they can claim to be “in compliance.” Most people fail to realize that being ‘in compliance’ only bears an indistinct and variable relationship to being secure.

William T. Shaw - PhD, CISSP

As I am sure you well know, “compliance” is more of an issue of legal niceties and being able to claim, in a court of law, to have taken reasonable measures, provided reasonable oversight and shown suitable governance. You can be in compliance and yet not be secure if you are merely willing to do the minimum necessary to meet regulations and are not actually focused on establishing true security. Most of the regulations regarding industrial automation security are somewhat vague and open to a range of possible interpretations. That is one reason that groups like the ISA’s SP-99 committee are trying to generate more detailed, specific recommendations. The NERC CIPs are an interesting mix of specific requirements (like requiring an IDS for an EMS/SCADA system) and generalities (like the lack of a standardized, defined methodology for performing vulnerability assessments). An organization could come into compliance with the “letter” of those requirements and yet still have security holes. As someone once said, perfect security is not possible, and obtaining it would be prohibitively expensive. So organizations have to make a basic business decision about how secure they need to be, and then let that be part of the equation when implementing a security program. Numerous electric utilities ignored the initial NERC cybersecurity pronouncements (1200/1300) and took a wait-and-see attitude, both because the recommendations were somewhat vague and because they had no enforcement “teeth.” Among the various organizations I have consulted with, some immediately wanted to know the minimum amount of effort that would allow them to claim compliance. They were not really interested in security (or felt they were already secure enough), only in compliance.

Bob Radvanofsky, owner of the SCADASEC email discussion list

The definitions of "compliance" and "security" (practically) contradict themselves. The definition of "compliance" (n) is

  1. (a) the act or process of complying to a desire, demand, proposal or regimen or to coercion; or, (b) conformity in fulfilling official requirements;
  2. a disposition to yield to others;
  3. the ability of an object to yield elastically when a force is applied.

"Security" (n), is defined as:

  1. the quality or state of being secure: as (a) freedom from danger; safety (b) freedom from fear or anxiety; or, 
  2. freedom from the prospect of being laid off
  3. (a) something given, deposited or pledged to make certain the fulfillment of an obligation; or (b) surety
  4. an instrument of investment in the form of a document (as a stock certificate or bond) providing evidence of its ownership
  5. (a) something that secures; protection; (b)(1) measures taken to guard against espionage or sabotage, crime, attack, or escape; or, (b)(2) an organization or department whose task is security.
    [taken from Merriam-Webster’s online dictionary: http://www.merriam-webster.com]
    To me, these definitions are not synonymous. Clearly, the definition of "security" does NOT constitute a method by which you are "complying with" something; consequently, "being compliant" does NOT guarantee that "something is secure." Interestingly enough, complying is based upon levels of coercion (meaning that someone or something made you do something against your will, or forcibly caused you to do something, as in "if you don't do this, something bad will happen to you"), whereas "security" represents a state of mind, meaning that one feels safe or secure only if certain preventative and/or reactive measures are implemented against an event, incident, element, factor or threat, internal or external.

    Observationally, I have noticed a trend within the energy sector of organizations feeling "secure," or stating that they are "secure," because they are "NERC-compliant." Being "compliant" does NOT mean that an organization is "secure," and it does NOT mean that an organization CANNOT be fined or sued over a security breach. To me, many organizations appear to have a false sense of safety because they are meeting a minimum requirement and can state that they are "NERC-compliant." Interestingly enough, NERC is neither a standards body nor an industry expert when it comes to "security," as its primary charter deals with the reliability of providing energy/power to the North American grid (hence the name, "North American Electric Reliability Council").

    Additionally, both "compliance" and "security" need to be constantly reviewed and adjusted accordingly. An organization that is "compliant" at the time of a security breach may well remain "compliant," yet it is now vulnerable to attack (operationally, cyber, physically, or otherwise) and is therefore "not secure" (but still "compliant"). To be "secure" means that, at that time, whatever methods, processes and procedures have been implemented provide an adequate level of safety and surety that an area, an environment or the entire enterprise is "secure." But humans are ingenious creatures of habit and can (and will) find weaknesses and/or vulnerabilities in any given area, so it continues to be an almost endless game of "cat and mouse" between the "cat" (the attacker) and the "mouse" (the victim).

    Lastly, "compliance" represents a minimum level of coercion; meaning, that as an organization, you meet the minimum requirement(s) to satisfy .  Whereas, with "security," the premise isn't so much a level of minimum or maximum, but "as needed" or "as necessary," not necessarily signifying that you (as an organization) are "complying."  Conversely, an organization can be "secure," but not "compliant."  Compliance deals with static conditions, either conditions that are repeatedly consistent, or do not change or alter over the course of time. Security, however, assumes nothing is static and works to solve , and evolves (or should evolve) over a period of time; therefore, security is non-static, but dynamic, and covers aspects of human dynamics.


    Bob Huba, DeltaV Product Manager, Emerson Process Management

    The question of how much security you need to be secure first has to be answered with "Secure from what?" As an example, during a recent conversation on security I asked a colleague, "Do you think you could keep me from breaking into your home?" He replied, "Probably not." I pointed out that the proper reply to that question is "Who are YOU?" If I am a high school kid trying to find an unlocked front door as I walk home from school, then yes, you can, and it is easy – just be sure your door is locked. If I am a professional burglar with a crowbar, then you need more protection than simply locking the door.

    The point I am trying to make here, and this goes to the second question as well, is that without a vulnerability assessment to know what threats you need to protect against, you can’t know if you have sufficient protections in place to mitigate these threats. Yes, you can take a worst-case vulnerability scenario, but then you run the risk of wasting time and money on too much protection and potentially putting your security emphasis on the wrong solutions that really do not make you more secure.

    Even though there is really no defined industrial security specification, I am seeing customer requests of “can you meet this spec?” where they hand us a NIST or NERC or some other system-wide security specification. Much of the time they have not really read the specification and have not taken any time to determine how it applies to their specific situation. On one occasion, when handed a familiar specification, I replied, “Yes, we can meet this…can you?” Upon discussion, the customer was rather surprised to find that most of the specification dealt with activities they had to perform―the control system was just a facilitator that helped them implement and meet the security policies and programs they needed to put in place to meet the specification.

    Let’s go back to the question of “how much security do you need?” Technically, you only need enough security to make the “bad guy” give up and try to hack in someplace else; that is, enough security that the time spent hacking in is not worth the reward. So if you are a big, disliked multi-national company, you probably need to spend a lot more on cybersecurity than some small regional company where the reward for breaking in is significantly lower. Even saying this much is the beginning of a risk assessment, which is the only way to adequately answer the “how secure” question. It will also answer the related question of how much security you can afford. A vulnerability assessment will also help you spend your security resources wisely in those places where you will get the “biggest bang for your buck.”

    To answer the question on what is the difference between “compliance” and “security,” let me compare a security program with a plant personnel safety program. To maintain a “safe plant” takes more than just complying with the safety standards. It requires creating a “culture of safety” where safety is everybody’s job and where people watch for and report unsafe conditions and practices—a  plant where safety is celebrated and rewarded.  Even though complying with safety rules might make their job more difficult, people understand the consequences and don’t take safety shortcuts. 

    In the same way, creating a secure system goes beyond just complying with a specification. It takes creating a “culture of security” where security is treated as everybody’s job and people understand the consequences of insecure behaviors. I recently visited a customer site and saw a security Post-it note on a cubicle that said, “Good job… no security violations were found in this office.” Upon asking, I was told that the security guards go around after hours and look for unsecured laptops, sensitive data lying in plain sight and other insecure situations. Then they post “atta boys” when they find people doing the right things. This is what I mean by creating a “culture of security.”

    In my experience, compliance deals more with meeting the minimum and trying to see what can be “gotten away with” when the auditor is not looking. Complying is something management puts in place and audits―it is not something an employee does or participates in. Compliance happens around employees not because of them.

    You will never create a secure industrial control system without employee participation.

    Marcus H. Sachs, Executive Director,
    Government Affairs - National Security Policy,
    Verizon

    Short answer: Compliance = auditors are happy. Security = investors and customers are happy.

    We tried to create a "culture of security" many years ago but failed. Instead we have created a "culture of compliance," and it has led to a lot of problems. We need to get out of the checkbox mindset and back to thinking like security experts when examining information systems (regardless of whether they are plant systems or enterprise IT systems).

    The first question can be answered by understanding risk. "Enough security" is reached when your residual risk falls below some established, acceptable level. A plant (or process or organization or person) cannot ever be 100% "secure," so it's pointless to try to get there. In economics it's called the principle of diminishing returns. You have to find the point where spending a dollar more on security only buys you 99 cents of risk reduction. After that point, you are throwing away money, since the incremental cost of a breach is less than the incremental cost of preventing the breach.
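    As a back-of-the-envelope illustration of that break-even point, the Python sketch below walks a list of candidate controls with made-up costs and risk-reduction figures and stops funding once a dollar of control buys less than a dollar of risk reduction; the controls and numbers are hypothetical.

# Minimal sketch of the diminishing-returns argument, using hypothetical numbers.
# Each candidate control has an annual cost and an estimate of the expected annual
# loss it avoids; spending stops once marginal benefit drops below marginal cost.

controls = [
    # (name, annual cost, expected annual loss avoided) -- illustrative values only
    ("network segmentation", 20_000, 120_000),
    ("patch management",     15_000,  60_000),
    ("intrusion detection",  25_000,  30_000),
    ("24x7 monitoring",      80_000,  40_000),
]

budgeted = []
for name, cost, risk_reduction in controls:
    if risk_reduction >= cost:          # marginal benefit still exceeds marginal cost
        budgeted.append(name)
    else:                               # past the break-even point: money thrown away
        print(f"skip {name}: ${cost:,} buys only ${risk_reduction:,} of risk reduction")

print("fund:", ", ".join(budgeted))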


    Steve Carson
    Multirode is a manufacturer of lift station controls and monitoring devices for water and wastewater utilities.

    My comment is that what most people in water/wastewater utilities are talking about when they talk about security is communications security and SCADA system security (firewalls, Internet client vulnerabilities and so on). Of the two, communications is the more vulnerable, particularly if it runs over a radio network, because a SCADA system can be locked down by IT staff much more easily with tools they already understand.

    Security focuses on the technology because it is topical. But the clearest security weaknesses are always in people's practices: passwords written in books next to computers, no locks on gates, or the ability to simply follow a random contractor in through the security gate after he has been given access.

    Paul Francis, Multirode CTO

    "To my mind, the two points are inter-related. To answer the question “how much security is enough?” implies that there's a one-size fits all solution for every situation: or that you even necessarily know when you are done, as opposed to it being a continual journey.

    In much the same way, just because your system “complies” with a particular directive or set of guidelines does not imply that it's necessarily inherently secure. Best-practice guidance, regulations and compliance documentation all provide critical input into the process, but achieving adequate security in depth―a single layer is not usually enough―means assessing the specifics relating to items such as your geography, communications infrastructure, employee policies and control, technical architecture, protocol support, physical access, training, criticality of data and systems, government or other regulatory body requirements, risk profile and much more.

    Regular auditing and testing of that framework will then help evolve the model further.

    John Cusimano, Director, Security Services, exida

    The answer to the question “how much security do you need” really depends on how much risk you are willing to accept. With the understanding that one can never completely eliminate risk, corporations need to quantify their level of tolerable risk and then design their systems to meet or exceed this level. Risk-based methodologies have been successfully applied to Safety Instrumented System design since the late 1990s and can arguably be applied to security design as well. Similarly, risk assessment methodologies can be applied to security assessments to quantify the risk inherent in a system. Once a risk level has been determined, system architects can apply defense-in-depth strategies to mitigate the risk to critical assets to a tolerable level.

    Compliance is a measure of conformity to a standard or regulation, whereas security is defined as being “free from danger or injury.” The relationship between the two is that one can establish a target security level in the form of a tolerable risk level and then measure whether one is compliant with that target.

    Todd Stauffer
    PCS 7 Marketing Manager, Siemens Energy & Automation
    http://www.sea.siemens.com/industrialsecurity

    How much security do you need to be secure?

    Security is really a relative term. There is almost no way to provide 100% assurance that a system is secure today and will be secure in the future. Some would say that an isolated system (air gap) is secure. That is true―until an engineer brings a memory stick from his office PC and connects it to the isolated system to transfer files.

    Security is also not a static concept. It is continually changing. To maximize security posture, owner/operators should implement a defense-in-depth security concept. This concept leverages technology (such as firewalls, access control, virus scanners), software patch management, physical protection and personnel operating procedures to create a layered defense. These measures must be continually updated and augmented to ensure that newly discovered security vulnerabilities are mitigated.

    Security also has a risk vs. reward curve that differs by business and by industry. For example, the potential consequences of a cyber incident to a piece of the U.S. critical infrastructure (like a nuclear power plant) could be catastrophic.

    What’s the difference between “compliance” and “security”?

    Wikipedia defines compliance as “the act of adhering to, and demonstrating adherence to a standard or regulation.”  For security this would imply adherence to a security regulation. But, unfortunately, there is no one accepted security regulation that governs process and SCADA applications across industries. The NERC Critical Infrastructure Protection (CIP) Cyber Security Standards govern electrical/bulk power industries. The Chemical Facility Antiterrorism Standard (CFATS) applies to high-risk chemical facilities.

    To fill the security compliance gap, ISA has created the Security Compliance Institute (ISCI). One of the main goals of this consortium is to define a security test specification that can be used for compliance testing of devices and systems. The institute will also coordinate the testing of devices against the compliance test specification resulting in the granting of the “ISASecure” certification. The testing protocol is being developed based on industry best practices and the work of various standards such as ISA 99.

    Darrell Pitzer

    On a SCADA security mailing list, I saw a request for perspectives on two issues:

    1. How much security do you need to be really secure?
    2. What's the difference between "compliance" and "security"?

    I don't mind responding to these, but the opinions are, of course, my own and not any position of the company that employs me.

    1. How much security do you need?

    It depends on what you are trying to defend from. If it is from outsider attacks/intrusions, then it is an ongoing process. The methods of attack change constantly, so the security defenses you use must also change constantly. There is no such thing as "enough," since the game is always changing. 

    If, on the other hand, you are trying to secure your systems from insider threats, the problem is different, but also equally difficult. It comes down to individual accountability. If something happens, can you tell precisely who did it and when? If you can't, then you are inadequately prepared. If you can, and something does happen, at least you can determine the guilty party and take action against them. If speed and/or flexibility are important to your business, you have to work hard to maintain individual accountability. If you have assets to protect at all costs, then speed and flexibility are not factors, and you define very strict rules to ensure individual accountability.

    2. "Compliance" vs "Security"

    Compliance is dotting the i's and crossing the t's for some form that is required by your company, or required by law. It really has nothing to do with security or secure systems. FISMA is such an example.

    Security is making sure that you protect your assets.

    I would wager that companies who are secure are also compliant. Companies that are compliant are not necessarily secure.

    Ernie Rakaczky, Principal Security Architect
    IPS

    To position the “compliance versus security” question correctly, one must first realize from the outset that there are multiple levels of being compliant, but that the number-one focus always seems to be that one snapshot in time―the audit.

    Holistically, compliancy is imperative for a successful cyber or physical security program. Being compliant can range from complying with an auditable set of federally mandated guidelines, like NERC CIP, to simply complying with an internal policy or procedure that has been established within a security program. Compliancy requirements drive the overall management of a security program, and it is within these requirements that many necessary security measures will be established.

    Here again, people first look at security measures as a means to protect, and that is paramount, but security measures also need to include monitoring, correlation, rationalization, etc. That is how you can begin to evaluate if the investment in security outweighs the value of the assets you are trying to protect.

    Compliancy can’t guarantee that established security measures of an organization will never be breached. But compliancy and practical security measures serve as the foundation for a culture where everyone feels like they have an equal stake in the overall security and success of the operation. In a security culture, the security program is part of the plant’s normal way of life.

    So how do you know if you have enough security to be really secure?

    To answer that, we have to use the basic definition of security as the degree of protection against danger, loss and criminals.

    To understand, manage and reduce risk, the following elements need to be considered. It all comes down to:

    • Assurance - the level of guarantee that a security system will behave as expected;
    • Countermeasure  - a way to stop a threat from triggering risk;
    • Defense-in-depth  - never relying on a single security measure, but on a layer of security measures;
    • Exploit - a vulnerability that has been triggered by a threat;
    • Risk - a possible event that could cause a loss;
    • Threat - a method of triggering risk;
    • Vulnerability - a weakness in a target that can potentially be exploited.

    So how does this all play into determining security requirements?

    For the most part, today’s conversations on security focus on the cost of and/or burden of implementing security. Plant and enterprise managers know that something has to be done, but at the same time, they want to know what they can do to reduce the cost of their investment while still mitigating risk.

    In some regards, just doing even a little will raise the security of the infrastructure substantially.

    Something that is frequently overlooked is the fact that today’s control systems and their interconnectivity can be used not only to improve operations, i.e., controlling the process better to move more down the wire, out the pipe, etc., but also to improve the flow of business information across the enterprise. That means that data that has always been there is more valuable now, and when it all comes together from several plants, it allows an operation to make better, more fundamentally sound day-to-day business decisions. So what we really need to start doing is quantifying that data and the value it brings to an organization to help make a solid business case for investing in security.

    At its core, the basic security requirement is to protect the process and operation from disruptions and all the issues that derive from an event. Some of the risks within process controls are related to policies that require moving existing data out of the actual control systems, things like EPA reporting, historian information, production management information, etc.

    It is safe to say that no plant is going to go back in time and reinstate the operational procedures that were common practice in the past: manual logging, data transfer via manual recording, estimating a value from a pen chart, etc. The challenge today is to identify these old practices and clearly show how continual modernization has saved money for and enhanced the financial models of the operating organization. By doing that, we should start to see the clear value and importance of applying stronger security to reduce risk, maintain business continuity and provide greater visibility into process areas.

    Dan DesRuisseaux, Manager, Ethernet Marketing Group
    Schneider Electric

    1) How much security is enough?

    Security requirements will vary based on the application and the criticality of the process. An application that monitors readings from sensors and does no active control will have different security requirements than a system that controls a complex, mission-critical process. Other variables, such as public perception, downstream processes and environmental impact, also weigh heavily on the decision to add more or less security. Additionally, security has a price tag associated with it, so security requirements must be weighed against the business impact of a breach and the up-front cost.

    2) What is the difference between being secure and compliant?

    Compliance does not assure security. A company can be compliant with internally or externally generated security regulations (NERC CIP), but may still be vulnerable to attack. To properly prevent security issues, a company must design a system that can defend against both external and internal attacks. Being compliant with any one standard does not guarantee security.

    Robert Huber, Co-Founder of Critical Intelligence

    What is the difference between "compliance" and "security"?

    Compliance is black and white, binary. You either are or aren't compliant. (Yes, auditors and regulators add shades of gray.) In compliance, you are measured against some type of standard. Compliance is synonymous with obedience. Security, on the other hand, implies a state, or feeling, of assurance that you are free of danger or risk. There is no yardstick to measure against in this case. It is very subjective, which leads back to the first question.

    Sean McBride, Co-Founder of Critical Intelligence

    How much security do you need to be really secure?

    One way to answer this question is to say that it depends on the determination of an adversary to attack your organization. Successful defenders will push the cost of attack beyond the adversary's ability or desire to conduct it.

    Although each adversary's level of determination to attack your organization will differ, one piece of conventional wisdom answers the "how much" question with "only more than my competitor," highlighting the preference of some attackers for gathering the low-hanging fruit first. Moreover, low-hanging fruit can be a proving ground for more difficult, higher-stakes future attacks.

    Relying on this conventional wisdom, you must ask yourself, how do I make my organization a less attractive target?  One baseline to consider is "due diligence," which we generally define as adherence to industry standards and regulations. So, to be a less attractive target than your competitors, you've got to be doing more than due diligence.

    One of the disadvantages of a transparent society is that, through publicly available information, adversaries can learn which groups are not doing due diligence. (Consider, for example, last year's public GAO report on security at the Tennessee Valley Authority, even though it was intended to be devoid of technical detail.) Hence we see that hand in hand with due diligence comes the concept of operational security, or OPSEC―the principle of not allowing attackers to easily obtain information that aids their attacks. Conventional wisdom tells us that the easier you make it for adversaries to learn about you (your systems, your networks, your organization), the easier it will be for them to successfully attack you. As an operator of critical infrastructure control systems in the Internet age, you must not fail to consider that little things you disclose about yourself in public forums may be used against you.

    When adversaries match the aforementioned public information about your organization with vulnerabilities and exploits you are at risk. The good news is that the cost of OPSEC is nothing more than the cost of personal responsibility.

    So how much security do you need? It honestly depends on who your adversaries are. But you can be sure that due diligence and operational security are cornerstones of "how much you need to be really secure."

    Kevin Staggs, Engineering Fellow, Global Security Architect
    Honeywell Process Solutions

    How much cyber security do you need to be really secure?

    There is no single answer to this question. The amount of cybersecurity needed depends on the plant network configuration and the amount of risk that a user is willing to accept with respect to cyber security. Not all plants and configurations will require all cybersecurity mechanisms. At a minimum, a plant should isolate the Process Control Network (PCN) from the corporate network. This isolation should be done using a stateful firewall configured to deny all traffic except for connections required between specific PCN nodes and corporate network nodes. It is recommended that the PCN not be able to reach the Internet directly. A good configuration would also include a DMZ between the corporate network and PCN. The data servers for moving information between the PCN and corporate networks would be located in the DMZ.
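    As a loose sketch of that deny-by-default posture (not Honeywell's implementation, and with purely hypothetical zone addresses and ports), the Python below models a rule set in which only explicitly listed corporate-to-DMZ and PCN-to-DMZ flows are permitted, while everything else, including any direct corporate-to-PCN or PCN-to-Internet connection, is denied.

import ipaddress

# Hypothetical zone addressing; a real policy would use the site's own address plan
# and be written in the firewall vendor's configuration language.
ZONES = {
    "corporate": ipaddress.ip_network("10.1.0.0/16"),
    "dmz":       ipaddress.ip_network("10.2.0.0/24"),
    "pcn":       ipaddress.ip_network("10.3.0.0/24"),
}

# Explicitly permitted flows; anything not listed is denied (the deny-all default).
ALLOWED = {
    ("corporate", "dmz", 443),   # corporate clients read plant data from the DMZ data server
    ("pcn",       "dmz", 1433),  # PCN historian pushes data up to the DMZ data server
}
# Note: nothing permits corporate -> pcn directly, and nothing permits pcn -> Internet.

def zone_of(ip):
    """Return the zone name for an address, or None for unknown/Internet addresses."""
    addr = ipaddress.ip_address(ip)
    for name, net in ZONES.items():
        if addr in net:
            return name
    return None

def is_allowed(src_ip, dst_ip, dst_port):
    """A connection is allowed only if it matches an explicitly permitted flow."""
    return (zone_of(src_ip), zone_of(dst_ip), dst_port) in ALLOWED

print(is_allowed("10.1.5.20", "10.2.0.10", 443))   # True: corporate -> DMZ data server
print(is_allowed("10.1.5.20", "10.3.0.15", 502))   # False: direct corporate -> PCN is denied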

    A best practice for determining how much cyber security is required is to perform a PCN cybersecurity assessment of your system. This assessment would evaluate such things as:

    • Firewall Management
    • DMZ Management 
    • Terminal Server Management 
    • OS Patch Management
    • VPN Remote PCN Access Management 
    • Automated PCN Vulnerability Scanning 
    • Intrusion Detection/Prevention
    • Anti-Virus Updates

    The assessment would review the above items and assign a risk value to each of them. In addition to the risk value, suggested mitigations are documented. With the risk assessment in hand, a SCADA system administrator would be able to evaluate each item and its associated risk and determine whether the risk is acceptable or requires mitigation. Once complete, the system would be at an acceptable security level for that operation.
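    A minimal sketch of how such an assessment might be tallied is shown below; the item names come from the list above, but the scores, the tolerable-risk threshold and the suggested mitigations are hypothetical placeholders.

# Each reviewed item gets a risk score (1 = low, 5 = high). Anything scoring above
# the plant's tolerable level is flagged for mitigation. All values are illustrative.

TOLERABLE_RISK = 2   # plant-specific; set by the operation, not by this script

assessment = {
    # item reviewed:                  (risk score, suggested mitigation)
    "Firewall Management":            (1, "none required"),
    "OS Patch Management":            (4, "define a monthly patch-and-test cycle"),
    "Anti-Virus Updates":             (3, "automate signature distribution via the DMZ"),
    "Intrusion Detection/Prevention": (5, "add IDS sensors at the PCN/DMZ boundary"),
}

for item, (score, mitigation) in sorted(assessment.items(), key=lambda kv: -kv[1][0]):
    status = "ACCEPT" if score <= TOLERABLE_RISK else "MITIGATE"
    print(f"{status:8} {item:32} risk={score}  {mitigation}")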

    Cybersecurity assessments should be repeated at least on an annual basis and higher risk systems should be assessed more often.

    What’s the difference between “compliance” and “security”?

    Compliance is an element of security. Most security management programs define a process for managing the security of systems. The process includes measures to determine if the security processes are being followed.  A security program is considered in compliance when there is auditable evidence that all security processes are being followed.