Connections Count in Control

Increasing Connectivity Brought Risks, Primarily That the Control System, No Longer Isolated, Became Vulnerable to the Same Attacks as the Enterprise

By Ian Verhappen

With 20/20 hindsight, our progress from pneumatics to wireless looks simple and logical. In light of Control's look back at how far we've come in automation over the last 25 years, I thought I would focus on the developments in networking over that time. How did we get to where we are with network connectivity?

Control loops, as the name implies, rely on connecting inputs through a control algorithm to outputs to maintain a process at setpoint. As a consequence, control has always been about connections. If anything, the degree of connectivity only continues to increase with time.

Initially, control loops were either "self-contained" or pneumatic, which limited the amount of connectivity possible between loops and certainly with the enterprise. The introduction of the electronic controller and control system made it easier to connect the controllers with the computers used to run the business or enterprise. About the same time, industry was adopting distributed control systems and PLCs. The Xerox PARC Ethernet—a contraction of ether and net(work)—was also being developed. Ten years later, this work became an international standard and native equipment in practically all computers. In its initial applications, Ethernet was used for file and message transfer, and later for web page transmission. By the mid-1990s, it was the de facto default network for connecting business systems together, both internally and with other enterprises.

For economic reasons, the use of industrial communications protocols follows the development of communications in the larger commercial environment. So, once it became apparent that Ethernet was the default network in the IT world, manufacturers in the control sector, as part of the move to commercial off-the-shelf and open systems to manage development costs, also moved from proprietary protocols to Ethernet and largely Windows-based schemes. With the adoption of Windows technology as a base, OPC Data Access (DA) v. 2.0 quickly became the common interface that all protocols could use as a universal gateway. Developers no longer had to create gateways between every pair of devices on a system, but could instead develop a single gateway between their device's protocol and OPC, greatly reducing the complexity of multi-vendor installations.
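The arithmetic behind that gateway argument is worth making explicit. A rough sketch (the figures are illustrative, not drawn from any vendor's catalog): with N protocols, bridging every pair directly requires on the order of N&sup2; gateways, while bridging each protocol once to a common interface such as OPC requires only N.

```python
def pairwise_gateways(n: int) -> int:
    """Gateways needed to bridge every protocol pair directly: n*(n-1)/2."""
    return n * (n - 1) // 2


def hub_gateways(n: int) -> int:
    """Gateways needed when each protocol bridges once to a shared hub (e.g., OPC)."""
    return n


# Compare the two integration strategies as the protocol count grows.
for n in (3, 5, 10):
    print(f"{n} protocols: {pairwise_gateways(n)} pairwise vs. {hub_gateways(n)} via a hub")
```

At 10 protocols the direct approach needs 45 gateways against 10 via the hub, which is why a single common interface scaled where point-to-point integration did not.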

Realizing the need to connect the evolving control and enterprise networks, ISA began work on its ISA-95 standard in 1995. The standard, which builds on the Purdue reference model, continues to evolve toward better integration between enterprise and control systems. Similar efforts by organizations such as World Batch Forum (WBF)/MESA, MIMOSA and others are implementing the ISA-95 model in different industry settings.

Metcalfe's law says, "The value of a network is proportional to the square of the number of connected users (nodes) of the system." The complexity of the system also increases with the number of nodes, but because we limit the number of protocols in use, that complexity remains manageable. With today's tools, the benefits of a network connection far outweigh the incremental cost of adding a new node.
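The trade-off in Metcalfe's law can be sketched numerically. In this illustration (the proportionality constant is an assumption chosen for readability, not a measured figure), network value grows as n&sup2;, so the value added by each new node grows with the size of the network even though the cost of adding it stays roughly flat.

```python
def network_value(n: int, k: float = 1.0) -> float:
    """Metcalfe's law: value proportional to the square of the node count."""
    return k * n * n


def marginal_value(n: int, k: float = 1.0) -> float:
    """Value gained by connecting one more node to an n-node network."""
    return network_value(n + 1, k) - network_value(n, k)


# The (n+1)th node adds k*(2n+1) in value: the larger the network,
# the more each additional connection is worth.
for n in (1, 10, 100):
    print(f"node {n + 1} adds {marginal_value(n):.0f} units of value")
```

Since the marginal value of node n+1 grows linearly with n while the cost of a new node is roughly constant, past a small network size each added connection pays for itself, which is the economic pull behind the ever-increasing connectivity described above.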

This increasing connectivity also came with risks, primarily that the control system was no longer isolated. It too became vulnerable to the same attacks as the enterprise; hence the need for the ISA-99 cybersecurity standards.

The past decade has seen the addition of wireless extensions based on the many permutations of IEEE 802.11 and 802.15.4 standards. Wireless systems are now common in the corporate world and gaining traction in industrial networks to extend them into applications not practicable using wired systems.

All these connections are made to transmit data. One challenge we as an industry now face is how to manage that data, and then convert it to knowledge for action. We will come back to this topic of asset management in March after seeing how networks have affected process analyzer systems, and what that might mean to future analytical capabilities.