Cambridge University Security Economics Paper

Ross Anderson and Shailendra Fuloria of Cambridge University have published a paper - Security Economics and the Critical National Infrastructure. I found a lot to like in the paper. It sets out the security-economics issues that arise in critical infrastructure protection, a discussion that has been missing. The authors have identified the externalities of correlated failures, which make control system cyber security so critical to address. The authors documented that control system investments will be replaced only when they are fully depreciated, which is a critical distinction. They also point out that security and reliability should be treated together. I wholeheartedly agree with the statement that the security engineering community already knows how to do things like crypto, protocols, and access controls; what we don't know how to do is to ensure sustainable implementation and effective use of these technologies in different business environments.

The authors can be expected to be widely quoted, which makes the following observations and corrections so important.

A discussion of cultural issues was missing. Cultural issues can have a significant impact on the perceived view of risk and security economics. I believe there is typically a culture gap between senior management and Operations (control systems) as to the importance of control systems to business operations and the associated business risk. I also believe that Y2K hurt IT organizations' credibility with senior management as to the business risk from computer incidents. It is difficult to justify a cost-benefit case to senior management when the organizations responsible for risk and cyber security may not be aware of the potential impacts to their business. For example, how do you address the security economics of a cyber-induced 6-12 month regional or national electric outage?

The paper states the Davis-Besse Nuclear Plant was shut down by an infected system. In reality, the plant was already in a cold shutdown condition many months before the contractor with the contaminated laptop appeared. The systems impacted by the contaminated laptop had no safety impact and minimal economic impact. There have been at least two other nuclear plant cyber incidents that had significant economic impacts. Unfortunately, they were not mentioned.

The paper states: “As far as we know, no-one has ever been killed by a cyber-terrorist, and this has limited the attention given by the media to the problems. Some people have even remained sceptical about whether online attacks could do real damage. So in March 2007, the Department of Energy’s Idaho National Laboratory made a video demonstrating the ‘Aurora vulnerability’ in which a series of ‘on’ and ‘off’ commands are sent to a generator, timed in such a way as to bring it out of phase and thus destroy it. The video was released to the press in September 2007; in it, a large generating set shudders, emits smoke, and then stops. This helped make clear to legislators that the confluence of the private but internally open systems used in industrial control, with open networking standards such as TCP/IP, was creating systemic vulnerabilities.”  People have been killed by control system cyber incidents. Whether the cyber incident was caused by a cyber terrorist or was not malicious, the people are still dead. Unfortunately, despite the Aurora test, too many people in industry are still not convinced it is real. Specifically, on March 27th I wrote a blog post - “Aurora is real and still not being addressed” - to respond to the denial of the applicability of the Aurora test. Moreover, the North American electric industry has done very little to date to address Aurora. This is certainly one of the reasons legislation will be forthcoming for control system cyber security.

The paper states: “The combination of the clear societal importance of a dependable energy and water supply, the evident vulnerability of existing systems, the salience of ‘cyber-terrorism’ and the societal sensitisation to terrorism since 9/11 have led to increasing amounts of money and regulatory effort being devoted to it.” Unfortunately, very little money has been set aside to address control system cyber security, and to date self-regulation has not worked (see Mike Assante’s April 7th NERC letter). This is another reason for legislating control system security.

The paper states: “The collision between the proprietary world of industrial control systems and the open world of IP-based networking was a root cause of the current problems with SCADA security. The Internet offers huge cost savings over proprietary networks, and – as in other applications such as banking – there was first a rush to use the new technology to save money, then a realisation that a lot would have to be spent on security in order to deal with the suddenly increased risk of remote attacks. Control systems engineers and vendors are therefore now coming into contact with traditional information security mechanisms, such as patch management and Common Criteria evaluations. A number of tensions are becoming evident.”  I believe the original root cause was the desire by various industries to wean themselves away from proprietary vendor solutions, which led to the introduction of commercial off-the-shelf operating systems such as Windows. There are many other issues, such as the need for remote vendor support as well as vendors continuing to provide vulnerable systems. In the early 2000s, NIST established the Process Control Systems Requirements Forum, which among other activities evaluated the Common Criteria for control system applications; it found that they did not apply. The movement is now toward NIST SP 800-53.

The paper states: “This appears to be particularly the case with firms from a defence background, while firms whose SCADA business evolved from a civil engineering or computing business tend to favour the normal CERT approach.”  Unless there is a difference in terminology between the UK and the US with the term “civil engineering”, SCADA and control systems did NOT evolve from a civil engineering approach. This is not a trivial comment. In December 2004, the American Society of Civil Engineers wrote a cyber security chapter for the water industry. The discussions on control systems were lacking.

Joe Weiss