The umbrella term “ICS” includes:
- automated control systems (ACS)
- distributed control systems (DCS)
- programmable logic controllers (PLC)
- supervisory control and data acquisition (SCADA) systems
- intelligent, electronically operated field devices such as valves, controllers, and instrumentation
- intelligent meters and other aspects of the Smart Grid
- networked computing systems
An ICS is actually a system of systems. A crude distinction between mainstream IT and control systems is that IT uses “physics to manipulate data” while an ICS uses “data to manipulate physics.” The potential consequences of compromising an ICS can be devastating to public health and safety, national security, and the economy. Compromised ICSs can lead, and have led, to extensive cascading power outages, dangerous toxic chemical releases, and explosions. It is therefore important to implement an ICS with security controls that allow for reliable, safe, and flexible performance.
The design and operation of ICS and IT systems differ. Different staffs within an organization conceive and support each type of system. IT designers are generally computer scientists skilled in the IT world; they view “the enemy of the IT system” as an attacker and design in extensive security checks and controls. ICS designers are generally engineers skilled in the field the ICS controls; they view “the enemy of the ICS” not as an attacker but as system failure. ICS design therefore follows the “KISS” principle (keep it simple, stupid), intentionally making systems idiot-proof. This approach results in very reliable but, paradoxically, cyber-vulnerable systems. Moreover, the need for reliable, safe, flexible performance precludes legacy ICS from being fully secured, in part because of limited computing resources. The result is a trade-off between performance/safety and security. These differences in fundamental approach lead to conflicting technical, cultural, and operational practices between ICS and IT that need addressing.
CIA Triad Model – Confidentiality, Integrity, and Availability:
- Confidentiality describes who may access the system or data
- Integrity describes the accuracy and completeness of the data
- Availability describes the reliability with which the system or data can be accessed
Traditional IT systems employ the best practices associated with the “Confidentiality, Integrity, Availability” (CIA) triad model – in that order of importance. Rigorous end-user access controls and additional data-encryption processes provide confidentiality for critical information.
Traditional ICS systems employ the best practices associated with the same triad model – in the reverse order of importance: AIC (Availability, Integrity, Confidentiality). Extra emphasis is placed on availability and message integrity.
A converged ICS/IT model would employ the triad’s best practices in an equally balanced way: the compromise of any element of the triad will cause the system to fail and become unusable.
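The contrast between the three priority orderings above can be sketched as a small lookup. This is purely illustrative; the function and table names are hypothetical, and only the orderings themselves come from the text.

```python
# Illustrative sketch of the triad priority orderings described above.
# The domain labels and orderings come from the text; the code itself is
# a hypothetical model, not a real security framework.

TRIAD_PRIORITIES = {
    "IT": ["Confidentiality", "Integrity", "Availability"],   # classic CIA order
    "ICS": ["Availability", "Integrity", "Confidentiality"],  # reversed: AIC
    "Converged ICS/IT": None,  # all three weighted equally
}

def rank(domain: str, prop: str) -> int:
    """Return the 1-based priority of a triad property in a domain
    (1 = highest). Equal weighting (None) ranks every property 1."""
    order = TRIAD_PRIORITIES[domain]
    if order is None:
        return 1
    return order.index(prop) + 1

print(rank("IT", "Availability"))   # lowest priority in traditional IT
print(rank("ICS", "Availability"))  # highest priority in traditional ICS
```

The point of the sketch is simply that the same three properties exist in every domain; only their relative weight changes.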
Another major difference between IT and ICS systems is worth pointing out. In an IT system the end user is generally a person; in an ICS the end user is generally a computer or other highly intelligent control device. This distinction lies at the heart of securing an ICS in a manner appropriate to current needs.
IT systems strive to consolidate and centralize to achieve economies of scale and lower operational costs. ICS systems are, by necessity, distributed systems that ensure the availability and reliability of the ICS and of the systems the ICS controls. This means remote access is often available directly from field devices, reducing the effectiveness of firewalls at the central demilitarized zone (DMZ) and requiring additional protection at remote locations. The limited processing power of field devices precludes the use of many resource-intensive IT security technologies, such as remote authentication servers. Newer ICS designs do, or will, employ advanced high-speed data networking technologies. Thus, what used to be a single attack vector (the host) multiplies by the number of smart field devices (Intelligent Electronic Devices [IED], smart transmitters, smart drives, etc.).
The use of mainstream operating system environments such as Windows, UNIX, and Linux for running ICS applications leaves them just as vulnerable as IT systems. At the same time, applying mainstream IT security solutions and methods will help secure more modern ICS host computers and operator consoles (i.e., PCs). With technologies such as Virtual Private Networks (VPN) used to secure communications to and from ICS networks, IT security focuses on the strength of the encryption algorithm, while ICS security focuses on what goes into the VPN. One of the Department of Energy’s national laboratories demonstrated this concern by showing how a hacker can, without great difficulty, manipulate widely used “middleware” software running on current mainstream computer systems. In this sobering demonstration, vulnerabilities in OPC (“OLE for Process Control”) code made the system appear to function properly even though it was not, displaying incorrect information on, or withholding correct information from, operator consoles.
Certain mainstream IT security technologies adversely affect ICS operation: components can freeze up during port scans, and block encryption can slow control system operation, producing a basic denial of service (DoS). IT systems are “best effort” in that they complete a task whenever they can. ICS systems are “deterministic” in that they must act NOW; a response that arrives later is too late.
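The best-effort versus deterministic distinction can be sketched as a deadline check inside a control step. This is a minimal sketch under stated assumptions: the function names, dummy sensor value, and 10 ms deadline are illustrative, not from the source.

```python
import time

CONTROL_DEADLINE_S = 0.010  # assumed 10 ms hard deadline for an ICS loop step

def read_sensor() -> float:
    # Placeholder for a real field-device read; returns a dummy value here.
    return 42.0

def ics_control_step() -> float:
    """Deterministic step: the result is valid only if produced on time.
    A late answer is treated as a failure, not a delayed success."""
    start = time.monotonic()
    value = read_sensor()
    elapsed = time.monotonic() - start
    if elapsed > CONTROL_DEADLINE_S:
        raise TimeoutError("control step missed its deadline")
    return value

def it_batch_step() -> float:
    """Best-effort step: the same work, but any completion time is acceptable."""
    return read_sensor()

print(ics_control_step())
print(it_batch_step())
```

The two functions do identical work; the only difference is that the ICS version rejects its own result when the timing constraint is violated, which mirrors why latency-inducing security controls can amount to a denial of service in a control system.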