Disclosures, FUD, and the need to maintain credibility

The issue of disclosure is not just about software and programming vulnerabilities, but also about disclosure of events. I have been following the issue of disclosures and FUD for quite a while and have generally been silent on the discussion. The recent Wonderware disclosure on the Digital Bond website inspired this response. The predominant concern with disclosures is exactly that: disclosures create security and/or business issues and inform the "bad guys" of things you don't want them to know. My concern goes deeper than making the disclosure - it is the technical validity of the disclosure.

More than four years ago, DOE funded Carnegie Mellon CERT and KEMA (myself) to develop a scoping study for establishing a CERT for control systems (the study was not made public). There were certain fundamental tenets in that report. The primary recommendation was that a CERT for control systems needs to have control system expertise. The recent National Infrastructure Advisory Council (NIAC) report dated January 16, 2007, "Convergence of Physical and Cyber Technologies and Related Security Management Challenges - Working Group Final Report and Recommendations by the Council," concurred. However, recent disclosures demonstrate that this expertise does not yet exist. Specific examples:

- US-CERT issued an advisory on November 16, 2006, on a worm outbreak at a public power company. The advisory was fraught with confusion. In fact, NERC issued the following note on November 21, 2006: "The NERC ES-ISAC recognizes that the report distributed this morning regarding the Worm Outbreak at Power Company suffered from a significant lack of detail. US-CERT is working to address this lack of detail, and further information should be forthcoming." It turns out this was an e-mail worm that had nothing to do with control systems.

- Following an Australian government warning on an ICONICS ActiveX application, US-CERT issued an advisory. They didn't realize the vulnerability was strictly in the website demo.

- DHS HITRAC stated that cyber was not included because there had been no previous incidents in the electric power sector (since the beginning of the Suspicious Activity Assessments). This was not because there were no cyber events, but because none were reported to DHS.

I am also concerned about accusing very reputable control system companies of covering up vulnerabilities.

Following the Knoxville Control System Cyber Security Workshop, I had a utility volunteer to serve as a test bed for evaluating control system cyber security and networking technologies. The test bed would be subscription-based, as we need to pay for equipment and staffing. However, unlike the test beds at the national labs with their non-disclosure requirements, all evaluation information would be made available to test bed participants. I would be happy to provide information on this proposed effort to anyone interested.

The way the cybersecurity establishment has presented the Wonderware disclosure on the Digital Bond website clearly shows the lack of control system expertise in the cybersecurity "industry." It IS an industry, and it is filled with people from IT security and cryptographic analysis backgrounds who have rarely, if ever, set foot in a control room for a process plant, refinery, or power plant. It isn't enough to be able to understand a vulnerability. It is every bit as important to understand the relative danger of the vulnerability IN CONTROL SYSTEMS. For example, the Wonderware disclosure isn't very dangerous. Why not? Because the vulnerability disclosed is limited to a very small population of control systems using an outdated version of the Wonderware software.

Like the ICONICS issue, revealing a vulnerability without a corresponding assessment of its impact is not only detrimental, but could be viewed (and certainly would be by Wonderware and ICONICS, for example) as unnecessarily injurious to their brands. These examples are clearly due to a lack of control system experience. Cybersecurity experience in IT, enterprise data centers, or government does not necessarily translate completely to control system cybersecurity. This is not to say that there aren't control system cybersecurity experts - there are, and quite a few. They are the ones that government and industry need to co-opt for blue-ribbon panels, cybersecurity tests, and strategy development.

Joe Weiss
Comments

Hi Joe and Walt - password and login systems are working, many thanks.

I will endeavor to focus on the intent of what you are saying and put the other comments about the passionate discussions aside for another time (over a beer and a laugh, perhaps).

Reviewing disclosure models for the controls space is an important issue, and there are many aspects to work through in my opinion, especially if you are trying to achieve meaningful acceptance and uptake. We all know about stakeholder uptake, and this applies in every context, especially with those we work with or consider our peers and colleagues in the industry. It is the fair and reasonable thing to do; I am certain we all agree.

The method and timing of public disclosure are very much an issue for end users and vendors alike. Perspectives may differ; however, I don't see any issue with the disclosure point being US-CERT and affiliates. The issues are largely about effective communications in an environment that fosters a sense of trust and comfort and is effective and responsive.

There has to be an undertaking to the extent of an industry charter, perhaps, between government, vendors, and end users alike. This means we have to encourage discussion and vigorous debate on the issues and work out a method based upon a consensus of opinion.

I am somewhat biased in that I think we have a pretty good foundation in Australia for handling this and a few other aspects around Business Continuity Management in this space. I believe the approach used here is scalable, more proactive, and self-sustaining given a bit more time.

The challenge is that this requires a huge shift in global culture, and in methodologies in some parts of the world, to bring about lasting and more effective change. This has to occur regardless of the motivation that is prompting it.

As I have indicated to you previously, Joe, I think the intent, purpose, and motivation behind your efforts are quite valid. Finding the way to encourage industry to change is where I think things are hitting the wall, and I am of the opinion that we are at quite a serious fork in the road.

It will continue, in my heartfelt belief, to be like climbing a smooth wall to the moon by hand without a rope until we can take the changes needed to fix our social concerns and turn them from a burden on business into a choice of lifestyle. This is not to say that it will be an instant silver-bullet fix.

I see so much money, energy, and valuable, talented human resources being wasted that could be redirected into partnering approaches to the issues. Visualize for a moment if the huge regulatory framework and its associated processes, committees, and intermediary efforts were "re-assigned" or refocused into a true partnership between government and industry. Imagine how effectively and quickly things could be achieved. Look at situations where partnering achieves outcomes, and compare them with similar instances that used the adversarial model.

That does not mean that the great work of the various standards is wasted, or that we throw out all regulation; it is just that how these instruments or tools are used to motivate people is different.

