Industrial control systems (ICSs) were designed for reliability, safety, and system operability and functionality. Many ICSs were originally designed before networking was commonplace; consequently, cyber security was not a design consideration. Many design features were included to make the systems more operator-friendly and functional, but once the systems are networked, those same features can be exploited and become vulnerabilities.
These ICS design features (or potential cyber vulnerabilities) have been part of ICS devices since they were designed and installed and, independent of cyber security considerations, will likely remain until the devices are retired. Such features include hard-coded default passwords that allow vendors to remotely monitor devices and build mean-time-to-failure databases, and "backdoors" that enable vendors to support end users when they (the end users) are locked out of their systems. Today, features include the use of iPads, cell phones, and other smart portable devices to remotely monitor, or at times even control, the systems. To the IT community, these features seem incomprehensible because of the apparent security lapses they create. To many ICS operators, however, these same features are comforting: the vendor is supporting them and responding to their requests for more operational flexibility.
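To make the point concrete, the hard-coded vendor credential described above can be sketched in a few lines. This is a hypothetical illustration of typical device-firmware login logic, not code from any real vendor; the account name and password are invented:

```python
# Hypothetical hard-coded vendor maintenance credential, baked in at build
# time so the vendor can always log in -- even when the end user is locked out.
VENDOR_BACKDOOR = ("vendor_svc", "Maint1234!")  # invented for illustration

def authenticate(username: str, password: str, user_db: dict) -> bool:
    """Return True if the login succeeds.

    The hard-coded vendor pair bypasses the site-managed user database.
    As a support feature this is convenient; once the device is networked
    it becomes a vulnerability, because the same credential is present on
    every unit shipped and cannot be changed by the end user.
    """
    if (username, password) == VENDOR_BACKDOOR:
        return True  # backdoor path: no per-site credential required
    return user_db.get(username) == password

# Even with an empty site user database, the vendor credential still works:
print(authenticate("vendor_svc", "Maint1234!", {}))  # True
```

Note that the backdoor check runs before the site database is ever consulted, which is why no amount of local password management removes the exposure.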
IT cyber security practitioners are currently focused on Advanced Persistent Threats (APTs), sophisticated malware "hidden" on IT networks. It is not known to what extent APTs have compromised ICS networks, since there is minimal monitoring of control system networks. In fact, I have seen actual "outside hits" on only two control system networks.
Aurora and Stuxnet demonstrate how design features of systems can be compromised and lie unknown for years. In the case of Aurora, the vulnerability was the lack of protection against intentional out-of-phase conditions, a gap that has been part of the grid's design for decades and was demonstrated in the INL test in 2007. Yet to this day, almost no hardware mitigations have been installed, even though they would be inexpensive. For Stuxnet, it was a compromise of design features in the Programmable Logic Controller (PLC). In fact, a version of Stuxnet has recently been found that was compiled in 2007, demonstrating that Stuxnet was in the wild years before it was "discovered" in 2010.
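The Aurora mitigation amounts to a synchronism check: permit breaker reclosure only when the generator and the grid are nearly in phase and at nearly the same frequency. The sketch below illustrates that logic; the permissive thresholds are hypothetical round numbers chosen for illustration, not values from any relay standard or from the INL test:

```python
# Sketch of synchronism-check logic for breaker reclosure (Aurora mitigation).
# Thresholds are hypothetical illustration values, not engineering settings.
MAX_ANGLE_DEG = 10.0    # assumed permissive phase-angle window
MAX_FREQ_DIFF_HZ = 0.1  # assumed permissive frequency difference

def reclose_permitted(gen_angle_deg: float, grid_angle_deg: float,
                      gen_freq_hz: float, grid_freq_hz: float) -> bool:
    """Return True only if reclosing the breaker is within the safe window."""
    # Smallest angular separation between the two voltage phasors (0..180 deg)
    diff = abs(gen_angle_deg - grid_angle_deg) % 360.0
    angle = min(diff, 360.0 - diff)
    return (angle <= MAX_ANGLE_DEG and
            abs(gen_freq_hz - grid_freq_hz) <= MAX_FREQ_DIFF_HZ)

# Nearly in phase: reclosing is permitted
print(reclose_permitted(2.0, 0.0, 60.0, 60.02))   # True
# Far out of phase (the Aurora scenario): reclosing is blocked
print(reclose_permitted(120.0, 0.0, 60.0, 59.9))  # False
```

An Aurora-style attack works precisely by commanding an open/reclose cycle fast enough that the machine drifts out of phase before the breaker closes again; a hardware device enforcing a check like this sits in the trip circuit and blocks the unsafe closure regardless of what the compromised control system commands.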
Given the long time it takes to detect the design features that become vulnerabilities, and the long time it takes to replace these large pieces of equipment when they are damaged (assuming replacements are even available; turbines have a typical two- to three-year backlog), this does not bode well for securing the critical infrastructures.