Implications of the “Siemens” PLC/HMI/SCADA vulnerability
I have identified several lessons learned from the Stuxnet.A/Siemens WinCC and Step 7 issue, chief among them the need to understand how control systems work and the need for control systems that are secure by design.
The need to understand how control systems work is illustrated by the conflicting recommendations from Microsoft and Siemens.
Microsoft wants default passwords changed (standard IT policy), while Siemens is telling its customers NOT to change the default passwords because doing so could cause operational problems. The IT folks do not understand why anybody would want to keep a default or hardcoded password as an emergency back door; enterprise IT, outside of banking, simply doesn’t have real-time emergencies. I wouldn’t be surprised to find some sort of accessible back door in banking systems as well.
Before control systems are changed, including patches, the domain experts need to approve the change. This is not the first time IT has tried to force office policies on control systems with bad results. The control system community keeps computers in service for extended periods of time, and the Microsoft advisory identifies older versions of Windows as affected that Microsoft no longer supports. What should end-users do when upgrades may not be possible? Formal patches may not be available, but third parties may be able to assist. This is an issue that end-users and their IT departments are generally ill-equipped to address.
In the control system environment, there are three considerations: performance, safety/reliability, and security, in that order of importance. This is nearly inconceivable to classically trained IT security professionals, but it is the correct order in the real-time world: security cannot be allowed to compromise the performance, safety, and reliability of the system.
In another example of the need to understand the mission, a utility engineer has been constrained by his IT and NERC CIP compliance team in controlling remote access to the control system. That is, they will not allow him to access the control system through the firewall. Consequently, to get reports from his Siemens PLC, he must download the report to a USB drive and “sneakernet” it back to his office PC.
This is truly disconcerting for any number of reasons.
To secure anything, you need to know what it does and why it is there. Unfortunately, today, there is too often a reaction to define security without understanding what the control system is trying to accomplish (see above).
The concept of default passwords burned into firmware did have a functional basis; security simply was not a consideration. In this case, the need is to define a more secure approach that still performs the function the hardcoded passwords were meant to provide.
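As a rough illustration of what “a more secure approach to the same function” might look like, the sketch below replaces a burned-in password with a site-configured credential, while keeping an emergency break-glass path whose every use is logged. The function name, the placeholder credential, and the logging scheme are all hypothetical assumptions for illustration, not Siemens’ actual mechanism.

```python
import hmac
from typing import List, Optional

# Placeholder for the old burned-in credential (illustrative only).
BREAK_GLASS_CREDENTIAL = "FACTORY-DEFAULT"

def check_credential(supplied: str, configured: Optional[str],
                     audit_log: List[str]) -> bool:
    """Accept the site-configured password when one is set.

    If the site never configured a password, fall back to the
    break-glass credential so emergency access still works, but
    record the event so its use is auditable.
    """
    if configured is not None:
        # Constant-time comparison avoids timing side channels.
        return hmac.compare_digest(supplied, configured)
    # Break-glass path: functional equivalent of the old hardcoded
    # password, except that every use leaves an audit trail.
    audit_log.append("break-glass credential used")
    return hmac.compare_digest(supplied, BREAK_GLASS_CREDENTIAL)
```

The point of the design is that the emergency-access function the vendors wanted is preserved, but it is no longer an invisible, universal secret: sites can override it, and any fallback use is visible to operators.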
To get around these problems, the initial design must include security. However, control system vendors are still slow to include security in the initial design, and most security work is done by personnel who may not understand the mission of the control system. It is imperative that the security (and forensics) community work with the control system domain experts who understand the mission of the control systems to develop appropriate security (and forensics).
It would be nice if we didn’t have to use COTS software with dozens of apps and their attendant vulnerabilities that we just don’t need. Maybe Microsoft would be willing to produce a “skinny” version of Windows for the control system industry that doesn’t have all the unnecessary bells and whistles.
And we still have to remember that the amount of security in a control system reflects an (often unwritten and sometimes unwilling) agreement between vendors and users: users get as much security as they are willing to pay for.