Even though there is really no defined industrial security specification, I am seeing customer requests of “can you meet this spec?” followed by a NIST or NERC or some other system-wide security specification. Much of the time they have not really read the specification and have not taken any time to determine how it applies to their specific situation. On one occasion, when handed a familiar specification, I replied, “Yes, we can meet this…can you?” After some discussion the customer was rather surprised to find that most of the specification dealt with activities they had to perform―the control system was just a facilitator that helped them implement and meet the security policies and programs they needed to put in place to meet the specification.
Let’s go back to the question of “how much security do you need?” Technically, you only need enough security to make the “bad guy” give up and try to hack in someplace else; that is, enough security that the time spent breaking in is not worth the reward. So if you are a big, disliked multinational company, you probably need to spend a lot more on cybersecurity than a small regional company, where the reward for breaking in is significantly lower. Even saying this much is the beginning of a risk assessment, which is the only way to adequately answer the “how secure” question. It will also help answer the related question of how much security you can afford. A vulnerability assessment will likewise help you spend your security resources wisely, on those places where you will get the “biggest bang for your buck.”
To answer the question of what the difference is between “compliance” and “security,” let me compare a security program with a plant personnel safety program. Maintaining a “safe plant” takes more than just complying with the safety standards. It requires creating a “culture of safety” where safety is everybody’s job and where people watch for and report unsafe conditions and practices―a plant where safety is celebrated and rewarded. Even though complying with safety rules might make their jobs more difficult, people understand the consequences and don’t take safety shortcuts.
In the same way, creating a secure system goes beyond just complying with a specification. It takes creating a “culture of security” where security is treated as everybody’s job and people understand the consequences of unsecure behaviors. I recently visited a customer site and saw a security Post-it note on a cubicle that said, “Good job… no security violations were found in this office.” When I asked, I was told that the security guards go around after hours and look for unsecured laptops, sensitive data lying in plain sight, and other insecure situations. Then they post “atta boys” when they find people doing the right things. This is what I mean by creating a “culture of security.”
In my experience, compliance deals more with meeting the minimum and trying to see what can be “gotten away with” when the auditor is not looking. Compliance is something management puts in place and audits―it is not something an employee does or participates in. Compliance happens around employees, not because of them.
You will never create a secure industrial control system without employee participation.
Marcus H. Sachs, Executive Director,
Government Affairs - National Security Policy,
Short answer: Compliance = auditors are happy. Security = investors and customers are happy.
We tried to create a "culture of security" many years ago but failed. Instead we have created a "culture of compliance," and it has led to a lot of problems. We need to get out of the checkbox mindset and back to thinking like security experts when examining information systems (regardless of whether they are plant systems or enterprise IT systems).
The first question can be answered by understanding risk. "Enough security" is reached when the residual risk falls below your established acceptable level. A plant (or process, or organization, or person) can never be 100% "secure," so it's pointless to try to get there. In economics this is called the principle of diminishing returns. You have to find the point where spending a dollar more on security only buys you 99 cents of risk reduction. After that point, you are throwing away money, since the incremental cost of a breach is less than the incremental cost of preventing the breach.
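The break-even logic above can be sketched in a few lines of code. This is a minimal illustration only: the measure names, costs, and avoided-loss figures are all hypothetical, and real risk assessments use far richer models. The point is the stopping rule, not the numbers: fund controls in order of benefit per dollar, and stop once a dollar of spend no longer removes a dollar of expected loss.

```python
# Hypothetical security measures: (name, annual cost in $, expected annual
# loss in $ the measure is estimated to avoid, i.e. probability x impact).
# All figures are illustrative, not real data.
measures = [
    ("patch management",      5_000, 40_000),
    ("network segmentation", 20_000, 60_000),
    ("24/7 monitoring",      80_000, 50_000),
    ("red-team exercises",   60_000, 10_000),
]

# Greedy by benefit/cost ratio: fund measures while a dollar spent still
# removes more than a dollar of expected loss (diminishing returns).
funded = []
for name, cost, avoided_loss in sorted(
        measures, key=lambda m: m[2] / m[1], reverse=True):
    if avoided_loss > cost:   # marginal benefit still exceeds marginal cost
        funded.append(name)
    else:
        break                 # past this point, money is being thrown away

print(funded)  # the measures worth funding under these assumed figures
```

Under these assumed numbers, patching and segmentation pay for themselves, while round-the-clock monitoring costs more than the expected loss it avoids, so spending stops there.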
Multirode is a manufacturer of lift station controls and monitoring devices for water and wastewater utilities.
My comment is that what most people in water/wastewater utilities are talking about when they talk about security is communications security and SCADA system security (firewalls, Internet client vulnerabilities, and so on). Of those two, communications is the more vulnerable―particularly if it runs over a radio network―since a SCADA system can be locked down by IT staff much more easily, with tools they already understand.
Security focuses on the technology because technology is topical. But the clearest security weaknesses are always in people's practices: passwords written in books next to computers, no locks on gates, or the ability to simply follow a random contractor in through the security gate after he has been given access.
Paul Francis, Multirode CTO
To my mind, the two points are interrelated. The question “how much security is enough?” implies that there is a one-size-fits-all solution for every situation, or that you even necessarily know when you are done, as opposed to it being a continual journey.