Dale Peterson wrote a blog post at www.digitalbond.com stating that "People Are Not THE Answer" to ICS cyber security. I disagree with Dale. I have frequently stated that the 75% silver bullet for ICS cyber security is appropriate policies, procedures, training, and architecture, and I believe the culture clash between IT and Operations is still the number one ICS cyber security problem. Relying on technology can actually exacerbate ICS cyber security problems and reinforce that cultural divide between IT and Operations.
Dale stated: "An argument can be made that processes are even more important than people. If technology limited what USB drives could be used, and there was a secure process for passing data between zones, would it really matter if the people understood the security ramifications. It's not uncommon for processes to be followed in ICS without the people understanding the reasoning behind the process. When X alarm goes off, I do Y and Z."

Responding in a rote fashion to an event without understanding its cause can, and has, resulted in major impacts: wrong procedures get applied, precursors get ignored, and technology gets bypassed or disabled when it appears to be an impediment or behaves inconsistently. This is neither good engineering nor common sense.

Dale stated: "Solutions should require the ICS team to think as little about cyber security as possible. It should be built in to the solutions and vendors should be deploying the solutions in a secure manner. A world where a large portion of the ICS team needs significant security training is a failure."

I vehemently disagree. A cyber event may cause previously unanticipated problems with the process, the operator displays, or both. There have already been numerous ICS cyber incidents caused by a lack of appropriate ICS cyber security awareness and training. The 2003 Northeast Outage, the 2008 Florida Outage, the 2009 Air France crash, and Stuxnet are examples of what can happen when you rely on technology and don't have the training to understand what is really happening.

A more mundane incident provides a glaring example of why people are so important to ICS cyber security. Engineers at a major beverage bottler thought the company's bottling system was secure until someone with access logged in and inadvertently changed the timer for a maintenance device on a filler that fills 1,200 bottles per minute.
The device was supposed to squirt grease into a bearing every 20 minutes, but the setting was changed to once every eight hours. As a result, the bearing soon froze, resulting in a significant loss. As the plant engineer put it, with well-intentioned engineers monkeying around with the automation system, who needs terrorists or disgruntled employees?
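The bottling-plant incident illustrates the kind of procedural safeguard the article argues for: engineering limits on parameter changes, backed by people who understand why the limits exist. The sketch below is purely hypothetical; the parameter name, units, and allowed range are invented for illustration, not taken from any real system.

```python
# Hypothetical sketch of an engineering-limits check on setpoint writes.
# A safeguard like this could have flagged the lubrication-timer change
# described above; all names and limits here are assumptions.

LIMITS = {
    "grease_interval_minutes": (10, 30),  # assumed acceptable lubrication range
}

def validate_setpoint(name: str, value: float) -> bool:
    """Return True only if the requested value is within engineering limits."""
    lo, hi = LIMITS[name]
    return lo <= value <= hi

# The original 20-minute interval is within limits; the eight-hour
# (480-minute) value an engineer entered would be rejected for review.
print(validate_setpoint("grease_interval_minutes", 20))   # within limits
print(validate_setpoint("grease_interval_minutes", 480))  # out of limits
```

Technology alone does not close the loop: someone still has to set sensible limits, investigate rejected changes, and recognize when a blocked write is a mistake versus something worse, which is exactly the training argument being made here.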
Dale stated: "The real hindrance to an organization that is serious about securing their ICS is the technology in the ICS itself." ICS technology does exactly what it was designed to do, and it does it in a very reliable manner. The problem is that security was not a design consideration. Because security is an add-on, systems become more complex (more fragile) and less reliable (less robust), making training and awareness even more important. I do not believe that trying to make the problem go away exclusively with technology will be successful. Developing approaches, both technical and procedural, to secure legacy ICSs is a major focus of the utility test bed project (see previous blogs).
Dale stated: "You can have the most well trained, professional people in the world, and if the bad guy or malware reaches the ICS it is all over. The underlying process is either stopped or modified, and the degree of damage is only limited by the time and skill of the attacker and an, inaccessible from the ICS, safety system." Adding to Dale's thoughts, it is not possible to hide the impact of an ICS cyber incident if it affects performance or safety. There have already been real-world examples of ICS cyber incidents involving plants shutting down, lights going off, pipes bursting, major environmental discharges, equipment destruction, etc. However, given the lack of ICS cyber forensics and logging, it may not be possible to identify that cyber played a role without appropriate ICS cyber security training and awareness. This is precisely why ICS cyber security training and system resilience are so important.