
The future of automation systems

Oct. 11, 2019
After years of lagging, automation systems are adopting the lessons learned from IT.

The core building blocks of today’s industrial automation systems are basically the same as they were when the first distributed control systems (DCSs) and programmable logic controllers (PLCs) were developed in the 1970s: controllers, I/O, operator consoles, engineering stations and a network. When applied to a system architecture design, each of these fundamental elements is well understood by industry professionals.

Early DCS and PLC automation system suppliers delivered basic operations technology (OT) performance and reliability as required for the process industries using proprietary, low-level technology. In the 1970s and 1980s, these automation systems met performance and safety requirements when off-the-shelf information technology (IT) equipment could not.

Over the ensuing decades, IT advanced at an accelerated pace, fueled by larger commercial markets that were encouraging competition and driving down costs. By the 1990s and 2000s, it was clear that IT was surpassing OT in functionality, cost savings and technological improvements.

During this time, the familiar OT building blocks transitioned from purely proprietary technology to hardware and software leveraging the faster-moving IT sector. One of the early moves occurred in the 1980s, when the first third-party graphics boards, such as those from Matrox, became available and replaced automation suppliers’ proprietary graphics circuitry. The increased functionality and lower cost of these commercial products couldn’t be matched by automation suppliers’ proprietary offerings, so they were adopted. This trend continued for many other automation system elements, with one last holdout: real-time controllers and I/O systems, primarily PLCs and DCSs.

Another relevant factor today is the rapidly increasing availability of IT-centric industrial internet of things (IIoT) devices, and end-user demands to use them effectively. The IIoT frenzy quickly brought forth first-generation edge devices, along with industry groups focused on standardizing edge-device architectures. More sophisticated edge devices continue to come to market, with increasingly advanced computing, control and analytical capabilities overlapping more and more with traditional automation systems.

The confluence of IIoT and challenges laid down by end users prompted one very large process company to consider how OT automation systems could be reimagined using commercial, off-the-shelf (COTS) IT components. Other process companies have joined the effort, along with many key automation suppliers. End user requirements include:

  • Incorporating best-in-class COTS hardware and software to create automation systems surpassing today’s DCSs in reliability, security and end user value,
  • Enabling end users to preserve their control strategies by porting them into upgraded or totally new systems,
  • Modularizing hardware elements such as computing, networking, storage and I/O terminations to allow incremental upgrades, and
  • Decoupling software from the hardware and I/O so it can run anywhere in the system (a minimal sketch of this idea follows the list).
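
To make the last requirement more concrete, here is a minimal sketch, in Python, of what decoupling control software from hardware and I/O can look like: the control logic is written against an abstract I/O interface, so the same function can run on any node that can reach the signals. The class, tag and function names below are invented for illustration and are not drawn from the O-PAS standards.

    # Hypothetical sketch: control logic written against an abstract I/O
    # interface so it can run anywhere in the system. Names are illustrative.
    from abc import ABC, abstractmethod

    class IOChannel(ABC):
        """Abstract I/O access, independent of where the signal terminates."""
        @abstractmethod
        def read(self, tag: str) -> float: ...
        @abstractmethod
        def write(self, tag: str, value: float) -> None: ...

    class SimulatedIO(IOChannel):
        """Stand-in backend; a real system might wrap 4-20 mA cards or a fieldbus."""
        def __init__(self):
            self._values = {"FT-101": 42.0, "FV-101": 0.0}
        def read(self, tag: str) -> float:
            return self._values[tag]
        def write(self, tag: str, value: float) -> None:
            self._values[tag] = value

    def proportional_control(io: IOChannel, pv_tag: str, out_tag: str,
                             setpoint: float, gain: float) -> float:
        """One scan of a simple proportional controller, hardware-agnostic."""
        error = setpoint - io.read(pv_tag)
        output = gain * error
        io.write(out_tag, output)
        return output

    if __name__ == "__main__":
        io = SimulatedIO()  # swap in any backend; the control logic is untouched
        print(proportional_control(io, "FT-101", "FV-101", setpoint=50.0, gain=0.8))

Because the control function only sees the abstract interface, moving it from one piece of hardware to another, or porting it into an upgraded system, does not require rewriting the control strategy itself.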

These efforts have started to transform DCS and PLC controllers, as witnessed by the open controllers suitable for the process industries now coming to market, and by the work of industry groups like the Open Group Open Process Automation Forum (OPAF). OPAF is working to meet these industry challenges, partly by leveraging work done in the IIoT and cloud markets.

One outcome of this will be the decoupling of computer hardware used for control from the software performing the control functions. This will enable radically different automation system architectures to be created using a small number of COTS IT hardware and software building blocks, which will host a new generation of state-of-the-art automation software.

Designing automation systems will become less physical and more abstract as the focus shifts to identifying required automation software functions, which may include continuous process control, batch process control, SCADA, IIoT or other needs. Once those functions are identified, hardware can be deployed in whatever configuration suits them.
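
A minimal sketch of that "functions first, hardware later" approach might look like the following, where the required automation functions are declared abstractly and only later mapped onto whatever COTS hardware is available. The field names and example functions are hypothetical, not taken from any standard.

    # Hypothetical sketch: declare required automation functions abstractly;
    # hardware enters the design only at deployment time.
    from dataclasses import dataclass

    @dataclass
    class AutomationFunction:
        name: str         # e.g. continuous control, batch, SCADA, IIoT analytics
        io_tags: list     # signals the function needs to reach
        cpu_cores: float  # rough compute requirement, used later for placement

    required_functions = [
        AutomationFunction("continuous_control", ["FT-101", "FV-101"], 0.5),
        AutomationFunction("batch_control", ["XV-201", "LT-201"], 1.0),
        AutomationFunction("scada_gateway", ["*"], 2.0),
    ]

    # The same list could be deployed to many small edge controllers or to one
    # virtualized server; the design itself stays hardware-neutral.
    total_cores = sum(f.cpu_cores for f in required_functions)
    print(f"Compute to provision, on whatever hardware is chosen: {total_cores} cores")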

Here are some examples of different architectures made possible by OPAF’s work:

  • Highly distributed control system: Instead of monitoring and controlling thousands of I/O points with a single controller, the workload could be spread over hundreds of small controllers. I/O signals, whether hardwired 4-20 mA or digital communications, could enter the system anywhere and be made reliably available globally for any controller by using time-sensitive networking (TSN) and standardized communication stacks like OPC UA. The ability to place I/O and control functions anywhere it makes sense enables load balancing, failure resiliency and easy expansion (see the placement sketch after this list).
  • Unit-based control system: Building upon the previous concept, this approach calls for a highly reliable, field-mounted edge device accessing and controlling I/O for a single process unit or major piece of equipment. Computing power could be supplemented by adding another controller nearby or anywhere to share the load.
  • Centralized control system: A sophisticated on-premises server running cloud software that virtualizes the processing, storage and networking functions could host all control and analytical applications. All I/O would be supplied using field-mounted devices with sufficient compute and communications power to provide low-latency access by the server. A centralized and virtualized solution could allow upgrades with no disruption to operations. This architecture has similarities to how many telecom companies handle internet and voice-over-internet-protocol traffic today.
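
As a rough illustration of the load balancing and failure resiliency mentioned for the highly distributed architecture, the sketch below places control functions on whichever controller node has the most spare capacity, then re-places them when a node drops out. The node and function names are invented; a real system would also depend on TSN-grade networking and a standardized stack such as OPC UA to move the I/O, which this sketch does not model.

    # Hypothetical sketch: greedy placement of control functions across many
    # small controllers, with simple re-placement when a node fails.
    def place(functions, nodes):
        """Assign each function to the least-loaded node (largest first)."""
        load = {node: 0.0 for node in nodes}
        assignment = {}
        for name, cost in sorted(functions.items(), key=lambda kv: -kv[1]):
            node = min(load, key=load.get)
            assignment[name] = node
            load[node] += cost
        return assignment

    functions = {"pid_ft101": 0.2, "pid_tt102": 0.2, "batch_unit3": 0.6, "analytics": 0.8}
    nodes = ["ctrl-01", "ctrl-02", "ctrl-03"]

    print("initial placement:", place(functions, nodes))

    # Failure resiliency: if ctrl-02 fails, its functions are simply re-placed
    # on the surviving nodes; the I/O remains reachable over the network.
    surviving = [n for n in nodes if n != "ctrl-02"]
    print("after failover:  ", place(functions, surviving))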

Each of these very different architectures offers varying strengths, weaknesses, resiliency and end-user value. But all three, and more, can be created using a small number of hardware and software technology building blocks conforming to the O-PAS standards, which OPAF is working to develop. Which architecture is appropriate for any one end user will be their decision; however, suppliers will have the flexible means to deliver custom-architected systems providing lifecycle savings and other benefits using a small set of technology and automation building blocks.

This is the future of automation systems.

About the author

Dave Emerson is Vice President of Yokogawa’s U.S. Technology Center in Dallas, Texas. He is experienced in applying and developing automation systems used in the process industries. Dave is an ISA Fellow and a member of Control Magazine’s Process Automation Hall of Fame. Dave has over 30 years’ experience participating on U.S. and international consensus standards committees such as ISA-88, ISA-95, IEC TC65 and ISO TC184. He also represents Yokogawa in several industry groups including leadership roles with MESA, MIMOSA and the OPC Foundation. Dave is Yokogawa’s primary representative with OPAF and currently serves as Co-Chair of OPAF’s Enterprise Architecture Working Group. He can be reached at [email protected].
