Breaking the interoperability barrier

March 21, 2017
ExxonMobil, Lockheed Martin and friends lead the charge toward an open, secure, interoperable process control system

We all knew this was coming—even if it took a few decades. Change can be denied and resisted for a long time, but eventually pressure builds, tectonic plates slip, volcanoes erupt, ice shelves crack, and suddenly the world is different.

In the process industries, end users face many similar forces: end-of-life and obsolete equipment and facilities; ever-larger, more complex projects with ever-tightening deadlines; and, unfortunately, some control suppliers unwilling to provide interoperable components and networking. Users have coped with these occupational hazards for decades, of course, but now they're compounded by tightening margins due to reduced energy prices from fracking and plentiful supplies of natural gas and oil.

Plus, like everyone else, process users see increasingly cheap and powerful microprocessors, software, Ethernet, wireless and Internet technologies proliferate in consumer smartphones and tablet PCs, watch them enable mainstream, IT-based enterprise applications, and justifiably ask why similar tools and efficiency gains aren't nearly as prevalent in process applications.

No surprise, a few users have finally had enough. They report their projects, operations and customers can't afford to keep coddling, and being hamstrung by, old, cumbersome, labor-intensive, time-consuming process controls.

“A lot of our systems are becoming obsolete, and we need to replace them and continue to add value. Traditional DCSs weren't solving our business problems, so in 2010, we began an R&D program and in 2014, we developed functional characteristics we could take to the process industry,” says Don Bartusiak, chief process control engineer at ExxonMobil Research and Engineering Co. (EMRE). “Our vision is a standards-based, open, secure, interoperable process automation architecture, and we want to have instances of the system available for on-process use by 2021.”

Julie Smith, global automation and process control leader in DuPont's process control consulting division, adds that, “DuPont is business-centric, but has a decentralized manufacturing and engineering structure, so our role is technical stewardship. We recommend the best, fastest, most cost-effective equipment we can find on the best platforms, and do lots of dynamic modeling and simulation. As a result, we're excited about the open process automation initiative. It's been needed for quite a while.”

Open control basics

About a year ago, Exxon hired Lockheed Martin as system integrator to oversee and coordinate development of an open, standardized, secure, interoperable process control system. Lockheed solicited requests for information (RFI), received 53 responses from suppliers, and began to build a database of who's capable in which technical areas. Following its studies and scope work this past year, it sent requests for proposal (RFP) to 82 suppliers, including the original respondents and others identified in an open call. Proposals for a proof-of-concept (PoC) prototype of Exxon's open control system were due back to Lockheed on Feb. 13, and Lockheed is scheduled to deliver the PoC prototype in fourth quarter 2017 (4Q17). [sidebar id =1]

To encourage other users, system integrators and suppliers to participate in developing and implementing the new open control system, ExxonMobil and Lockheed Martin have also spent the past year working with the Open Group to form the Open Process Automation (OPA) Forum, which is billed as “an international forum of end users, system integrators, suppliers, integrated DCS suppliers, academia and other standards organizations working together to develop a standards-based, open, secure and interoperable process control architecture.”

The Open Group is a global consortium that helps members of its forums achieve their business objectives with information technology (IT) standards. For example, Lockheed has participated in the Open Group's Future Airborne Capability Environment (FACE) Consortium, which is a gathering of avionics manufacturers formed in 2010 to create an open avionics standard for making military computing operations more robust, interoperable, portable and secure. FACE has been an inspiration and model for what OPA hopes to accomplish.

“Openness is about more than interoperable technologies. It's about improving relationships between people and between enterprises, and making the whole greater than the sum,” adds Steve Bitar, program lead for ExxonMobil's open architecture initiative, who spoke at the ARC Advisory Group Industry Forum on Feb. 6-9 in Orlando, Fla. “It's compelling to believe all components should be modular and open, but in practice, other factors may be deemed more important than openness and modularity alone, and a risk-based analysis, using new technologies and comprehensive testing, can help determine which components should remain tightly coupled. The question is, what can we break up, but still ensure reliability? One of the main reasons we're pursuing an avionics model is because those systems also connect hundreds of devices in standardized ways, so it's easy to make sure they're safe before takeoff.”

Once OPA defines its business framework, it will begin to pick and choose standards—including considering those already available for networking and controls—for its interoperable system, and then draft conformance certifications for its open components. Following its launch at the ARC conference, OPA is scheduled to:

  • Provide a business guide about its standards effort in 2Q17;
  • Release OPA standard, version 1, in 1Q18;
  • Start a conformance certification program in 3Q18; and
  • Release OPA standard, version 2, in mid-2019.

Lessen the layers

“Our effort is inspired by avionics and military aviation because they've successfully transitioned from customized systems to open and interoperable ones,” adds Bartusiak, who led presentations and panel discussions on Exxon, Lockheed and the Open Group's open systems initiative at the ARC event. “We also see this as a way to use virtual technologies to allow our control systems to be different from the seven-layer Purdue control hierarchy model that began to be developed in the 1970s. For example, the new, open control system will build in security, and have wireless, cloud and Internet of Things (IoT) connections.”

Referenced and applied in several ways for different applications, the Purdue model for control hierarchy (often drawn alongside the International Organization for Standardization's seven-layer Open Systems Interconnection, or OSI, networking model) has served as a conceptual framework for controls, networking, enterprise and security strategies and standards, such as ISA-95 for control/enterprise interfacing and ISA-99 for cybersecurity. More recently, it's been joined by a simpler, five-layer model for IT and Internet applications.
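
For reference, the level numbers cited in the next few paragraphs follow the commonly used Purdue shorthand. Here's a minimal sketch of those levels as a simple lookup table; the labels are informal summaries for orientation, not an authoritative ISA-95 definition:

```python
# Informal sketch of the commonly cited Purdue control-hierarchy levels.
# Labels are shorthand summaries, not an authoritative ISA-95 definition.
PURDUE_LEVELS = {
    0: "Field devices: sensors, actuators, valves",
    1: "Basic control: PLCs, single-loop and DCS controllers",
    2: "Supervisory control: HMIs, alarms, operator workstations",
    3: "Operations management: MES, historians, advanced control such as MPC",
    4: "Business planning and logistics: ERP, scheduling",
}

for level in sorted(PURDUE_LEVELS, reverse=True):
    print(f"Level {level}: {PURDUE_LEVELS[level]}")
```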

“If we're controlling temperature and pressure in a process, those components are typically networked on Level 3 of the Purdue model, but that means only Level 3 is reusable. We want to get rid of that hurdle, and get application portability at Level 2,” explains DuPont's Smith. “Likewise, we struggle with devices like process heaters that usually have a multivariable process control (MPC) on Level 3 as well. They work OK for a while, but they're also prone to wear, require updates, and get broken communication links that need patching. These features can be hard to get back, and that's why they often fall into disuse. We could redesign these solutions to put in a different DCS, but if we could simply get away from using a proprietary DCS, then we could make advanced control a lot more portable, too.”

Bitar adds that Exxon and OPA also want to migrate from the usual, vertical hierarchy of sensors, controls, operations management, business planning and services to a flatter, simpler architecture with more “democratic” devices participating jointly in their real-time service bus network, and polled as needed by controls, operations, business and service functions. “This allows decoupling the sensor or other data producer from its consumers, enables configurable quality of service with no re-provisioning, reduces sensor integration costs, and improves data bandwidth,” he says. “We just need to get out of the century-old paradigm that each single-loop controller can only support one sensor and one actuator. Nothing binds us to this single-loop archetype—everything can work with everything else now—but we can't seem to get out of the idea of this algorithmic pairing of one single input and one single output for our systems, even though there are better ways to control highly interactive and dynamic processes.”
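
As a rough, hedged sketch (not OPA's actual design), the decoupling Bitar describes looks a lot like a publish/subscribe bus in software: a sensor publishes a value once, and any number of consumers subscribe without the producer knowing who they are. The class and topic names below are purely illustrative:

```python
from collections import defaultdict

class ServiceBus:
    """Toy publish/subscribe bus: producers publish to a topic, and any number
    of consumers subscribe without the producer ever knowing who they are."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, value):
        for callback in self._subscribers[topic]:
            callback(value)

bus = ServiceBus()

# One published reading, several independent consumers: control, historian, analytics.
bus.subscribe("reactor1/temperature", lambda v: print(f"Control loop sees {v} degC"))
bus.subscribe("reactor1/temperature", lambda v: print(f"Historian archives {v} degC"))
bus.subscribe("reactor1/temperature", lambda v: print(f"Analytics app scores {v} degC"))

# The sensor (data producer) publishes once; it is fully decoupled from its consumers.
bus.publish("reactor1/temperature", 187.4)
```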

Gene Tung, IT division lead for Merck & Co.'s vaccine manufacturing plants, adds that, “We have a lot of legacy process equipment, suppliers we rely on, and a corporate DCS standard. However, there's still a lot of variety in our process controls, so we use about 50% outsourced and 50% in-house experts, and we'd all like to see more standards and languages.”

Waking up from history

Like any big shift, the interoperable process control movement doesn't come from nowhere. Irritation over the lack of interoperability is an old problem, but it's long been treated as an unavoidable cost of doing business that users just had to live with. There have been many efforts to create greater openness and interoperability in process controls and networks, and though some moved the needle on openness, they all fell short of actual, plug-and-play interoperability. Even common Ethernet cabling couldn't make the proprietary protocol languages talk to each other. Protocol-converting modules, software, communication strategies like OPC UA and Internet-aided data transfers have helped, but plug-and-play control still seems out of reach to most end users.

“This problem goes back to before the Foundation fieldbus (FF) protocol (ISA SP50) began in 1985. That project started as a way to fix a problem that Exxon Chemical was having—whenever they had a new DCS to put in, they were on the hook to just one supplier, had no choice but to use it, and felt like they couldn't seek competitive bids. Many of those constraints are still in place today, whether by software, training or devices on Ethernet that can't interoperate,” says Dick Caro, CEO at consultant CMC Associates.

“The idea for FF was to put more intelligence in the field instruments, and even put process control in the field as an alternative to a DCS. It worked, and the first FF H1 devices hit the market in 1997,” explains Caro. “FF High-Speed Ethernet (HSE) was developed next with a full protocol stack and host compatibility testing, but suppliers didn't find enough demand for it. Only ABB and Yokogawa implemented FF HSE and passed host compatibility testing. Most suppliers just kept installing FF H1, and terminated at the control room with the I/O count on the termination card. The problem was that if a user wanted to implement HSE, then everything could come back through one Ethernet port, and the supplier would lose all the revenue from the I/O equipment they'd been selling before.” 

[sidebar id =5]

Caro adds that ExxonMobil has several hundred DCSs in the field, including some from the 1970s through the 1990s, but replacing them with more proprietary DCSs would have cost billions, so they began seeking to put their I/O in the plug-and-play category. “We had a project team that met during 2014-15 to find a DCS replacement, and hoped that someone would come up with a solution that would meet Exxon's requirements, but nothing happened,” he says. “The suppliers just talked about how great their existing products were. That's when Exxon began developing its open systems vision and reference architecture in 2016, and started seeking a way to build it.”

Caro reports the unique part of Exxon's reference architecture is its patent-pending distributed control nodes (DCNs), which will be able to take a 4-20 mA or HART signal from an instrument, perform single-point data substitutions, add FF H1 function block capability, and use an internal analog-to-digital (A/D) converter to have that signal come out speaking an open Ethernet protocol. “There are millions of perfectly good HART devices in the field, and these DCNs will be able to interface with them as well as with plain 4-20 mA signals,” adds Caro. “The joy of the DCN is that it enables FF functions to be installed and used without ripping and replacing instruments. You just need to intercept the wire, and the rest is software. Plus, the other magic is that a supplier doesn't have to invest in new software; if they want to build a DCN, they can use the FF software that's already in their instrument.

“What this boils down to is that suppliers will have to do FF HSE and perform host compatibility testing, and build inexpensive DCN hardware that's industrially protected. Also, if process devices are configured with FF as their base logic and programmed with FF software, then everyone will be doing it the same. This will make all these components far more interchangeable, and let users demand competitive bidding because everyone will conform to the FF interface. This has been the dream for a long time.”
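
To make the DCN idea concrete, here's a minimal, hypothetical sketch of the signal path Caro describes: a 4-20 mA loop current is digitized, scaled to engineering units, and published onto an open digital network. None of this is Exxon's patent-pending design; the class, function and tag names are invented for illustration:

```python
def ma_to_engineering_units(current_ma, range_low, range_high):
    """Scale a 4-20 mA loop current to the instrument's calibrated range."""
    if not 3.8 <= current_ma <= 20.5:            # rough NAMUR-style fault limits
        raise ValueError(f"Signal fault: {current_ma} mA")
    fraction = (current_ma - 4.0) / 16.0
    return range_low + fraction * (range_high - range_low)

class DistributedControlNode:
    """Hypothetical DCN wrapper: digitizes an analog input, scales it,
    and publishes the value onto an open, Ethernet-based network."""
    def __init__(self, tag, range_low, range_high, bus):
        self.tag = tag
        self.range_low = range_low
        self.range_high = range_high
        self.bus = bus

    def scan(self, raw_ma):
        value = ma_to_engineering_units(raw_ma, self.range_low, self.range_high)
        self.bus.publish(self.tag, value)        # e.g., the service bus sketched earlier
        return value

class _PrintBus:
    """Stand-in for an open digital network, for demonstration only."""
    def publish(self, topic, value):
        print(f"{topic} = {value:.1f}")

dcn = DistributedControlNode("TT-101/temperature", 0.0, 200.0, _PrintBus())
dcn.scan(12.0)   # 12 mA on a 0-200 degC range publishes 100.0 degC
```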

OPA nuts and bolts

Bartusiak reports that OPA's vision for its open, secure, interoperable, standards-based process control architecture consists of nine primary characteristics. The three most important are:

  • Conformant components for systems that are fit-for-purpose for end users’ needs and low-cost to integrate;
  • Adaptable intrinsic security; and
  • Market expansion opportunities for suppliers' and system integrators' components and services.

The remaining six characteristics are:

  • Best-in-class components that can deliver timely access to leading-edge capability and performance;
  • Commercially available solutions that are applicable to multiple industry sectors;
  • Protection of suppliers' intellectual property within conformant components;
  • Portability and preservation of end users' application software;
  • Simplified future replacement and reduced system lifecycle costs; and
  • Promotion of innovation and value creation.

Overall, OPA's scope will include traditional distributed control systems (DCS) and their I/O, programmable logic controllers (PLC) and their I/O, human-machine interfaces (HMI), advanced controls and manufacturing execution systems (MES). Its jurisdiction doesn't include field sensors, valves, actuators and other plant equipment; safety instrumented systems (SIS) and their I/O; or business systems.

All nine ingredients in OPA's vision fit into its reference architecture for interoperable controls and networks, which participating developers, suppliers, candidates and other contributors have used to draft their RFI and RFP responses (Figure 1). The three new, innovative parts expected to provide openness and interoperability are:

  • Real-time advanced computing (RTAC) platform, which is the OPA architecture's controller;
  • Real-time, universal service bus from which all applications can draw data. This network will include OPA's standardized communication protocol. Developers are also researching how to give it built-in cybersecurity; and
  • Distributed control node (DCN) configurable I/O for input/output processing, regulatory control, logic solving and application hosting.

  [sidebar id =2]

“The configurable I/O and RTAC are using software and virtualized computing to define their network, which will let users employ commodity hardware, but still meet their need for upper-level services,” explains Bartusiak, who reports that one of OPA's main goals is to get as many end users and system integrators to join as possible. “This industry initiative isn't just ExxonMobil. We're trying to calibrate everyone to the same page. We want to have multiple-thread efforts to prove the open process automation concept. What's unique about OPA's initiative is that its business framework allows participants to learn. So this is really a call to action. If you're a process control and automation end user or a system integrator, we need you.”

So far, OPA's membership consists of:

  • Nine end users, including Aramco Services, BASF, Chevron, Dow Chemical, ExxonMobil, Koch Industries, Merck, Praxair and Shell;
  • Five (and soon, maybe seven) DCS vendors, including ABB, Emerson, Honeywell, Schneider Electric and Yokogawa;
  • Three DCS-adjacent suppliers, including GE, nxtControl and Siemens;
  • Eight hardware suppliers, including Cirrus Link, Cisco, Curtiss-Wright, Hewlett Packard, Huawei, IBM, Intel (Wind River) and Relcom;
  • Five software suppliers, including AspenTech, Inductive Automation, Mocana, Process Systems Enterprise and RTI;
  • Three other suppliers, including ATE Enterprises, Conexiam and Mitre;
  • Four system integrators, including Accenture, Lockheed Martin, Radix and Tata Consulting; and
  • One other organization, ARC Advisory Group.

Along with involving more users and other participants in its organizing and development process, Bartusiak adds that OPA expects to ensure accountability for the performance of its open, interoperable system with help from FACE's successful procurement method for faster, cheaper solution delivery and implementation. It also plans to rely on the experience of the 500 members of the Control System Integrators Association.

“Six or seven of the eight major DCS suppliers have already joined OPA, so all the users and other potential participants have to get with the program, too. Most want to be part of the efforts to change and achieve interoperability,” adds Bitar. “Proprietary or open is a choice. In fact, U.S. Dept. of Defense (DoD) contractors went through this same process in 2008-11. The avionics suppliers were told they had to change, and there was a lot of fear and doubt. It was a huge disruption to change to a more standardized model, but now most of them say they wouldn't go back to their old model. The old, proprietary way seeks to secure markets and lock out competition, while the new, interoperable strategy seeks to broaden the market, and let suppliers build what each is really good at.”

Responses and promises

While some control suppliers have been notably silent or are still formulating a response to Exxon and OPA's call for interoperable controls, others say the push for interoperability is an opportunity for them.

“We've worked with ExxonMobil for a long time, and they and other end users are faced with replacing huge amounts of old DCSs, but there's no longer a benefit in modernizing with new versions of old equipment. The DCS has become a bottleneck in many cases,” says David Barnes, global strategic sales leader at Yokogawa Electric Corp., which joined OPA in November and responded to the RFI and RFP for Exxon and Lockheed's open control system prototype project. “We've been pursuing these threads independently for awhile, and developed our Agile Project EXecution (APEX) program to help users and OEMs integrate equipment from all suppliers. We have a choice: adjust and change, or stay stuck in cement until the world passes us by. We view OPA as a chance to meet the needs of our industry that created the need for the Industrial Internet of Things (IIoT) in the first place.”

Dave Emerson, director of Yokogawa's U.S. technology center, adds, “The influx of IT into process automation is obviously increasing, and just like when so much software moved from Unix to Windows, the same forces are impacting DCSs. They need openness and gateways from edge devices to get data to the cloud quicker. However, one especially important task will be to uncouple application software, such as configurations, function blocks, control strategies and DCS graphics, and separate it from the technical architecture, such as operating systems, network technology and hardware. This will allow the technical architecture and its devices to be refreshed without having to import new application software.”
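
Here's a minimal sketch of the uncoupling Emerson describes, assuming nothing more than an abstract platform interface: the control strategy (application software) reads and writes tags through that interface, so the hardware, operating system or network underneath (the technical architecture) can be refreshed without touching the application. All names are illustrative:

```python
from abc import ABC, abstractmethod

class Platform(ABC):
    """Technical architecture: whatever hardware, OS and network host the application."""
    @abstractmethod
    def read(self, tag): ...
    @abstractmethod
    def write(self, tag, value): ...

class PIController:
    """Application software: a control strategy that knows nothing about the platform."""
    def __init__(self, pv_tag, out_tag, setpoint, kp, ki, dt):
        self.pv_tag, self.out_tag = pv_tag, out_tag
        self.setpoint, self.kp, self.ki, self.dt = setpoint, kp, ki, dt
        self.integral = 0.0

    def execute(self, platform: Platform):
        error = self.setpoint - platform.read(self.pv_tag)
        self.integral += error * self.dt
        platform.write(self.out_tag, self.kp * error + self.ki * self.integral)

# Refreshing the technical architecture means supplying a new Platform implementation;
# the PIController application is carried over unchanged.
```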

Meanwhile, OPA's other co-chair, Trevor Cusworth, reports that, “Participating will help us all come to a standard we can use. The better participation we get, the better our chance of success will be.” Cusworth is also global account manager for Schneider Electric's industry business.

Dr. Peter Martin, vice president of business value consulting and Edison master at Schneider Electric, adds that, “The world is changing, and we need to look at leveraging larger prizes than we have in the past. We have to look beyond what we can sell next week, and go much more towards serving customers in the long run—just as more open and interoperable technologies still need to focus on driving value to the bottom line. There's no question that this is going to be a tough transition, but there's also no question that it's absolutely necessary.”

How suppliers can win

To get more players involved in OPA and interoperability, Bartusiak reports there are three mechanisms suppliers can use to succeed in the looming, less proprietary, more open and interoperable control environments and markets.

“The first is reducing systemic costs by considering the total expenses of delivering input and output signals and data,” explains Bartusiak. “Users want to land their field wires and have their signals convert to an industry-standard, digital protocol that can be software-assigned to their computer on the control layer. There's no sense in continuing to have A, B and C flavors. Doing this could also help take a lot of cost out of the supply chain.”
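
As a hedged illustration of what “software-assigned” signals could mean in practice, the sketch below keeps signal-to-controller routing in configuration data, so moving a signal to a different compute node becomes a software change rather than a rewiring job. The tags, hosts and protocol entries are invented for the example:

```python
# Signal-to-controller assignments held as software configuration,
# rather than fixed by which termination card a wire lands on.
signal_assignments = {
    "FT-101/flow":        {"protocol": "OPC UA", "assigned_to": "rtac-01"},
    "TT-204/temperature": {"protocol": "OPC UA", "assigned_to": "rtac-01"},
    "PT-310/pressure":    {"protocol": "OPC UA", "assigned_to": "rtac-02"},
}

def reassign(tag, new_host):
    """Move a signal to a different compute node with a configuration change only."""
    signal_assignments[tag]["assigned_to"] = new_host

reassign("PT-310/pressure", "rtac-01")   # no rewiring, no new termination hardware
```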

Bartusiak adds that the second way for suppliers to succeed with OPA's interoperability is to increase their margins by specializing in and differentiating their own technical advantages. “This involves how they manage their software namespaces, and how to do dynamic memory allocation, while adding or deleting points and executables,” he says. “This will allow suppliers to do more of what they do best.”

Third, Bartusiak argues that suppliers can increase revenue in the interoperable era by striving to expand the overall process control and automation market into new manufacturing and industrial areas where it hasn't served before. “We need to grow the pie,” he adds. “Process engineering concepts, such as feedback, can be applied to new areas like planning, scheduling and others.

“Much of this is like we're still in the days when different railroads had different track sizes, so cargo had to be moved to different cars in railroad yards. At the same time, we're all seeing how easy it is to use third-party apps on our smart phones and tablet PCs, and we want to know why we can't get these functions in our world. We all have to change. However, in the history of standards development, the key is having end users actively involved in the sausage-making. That's why we need end users to join the OPA forum and actively participate.”

OPA aided by Open Group

One reason why OPA's organizers are confident their interoperability quest will succeed is the support they're getting from the Open Group, which has 563 member organizations and 40,000 participants in 126 countries. The group has coordinated similar standards efforts in other areas, such as security, IT, embedded systems, supply chain and software, including the Unix platform's base standard, its evolution and product certification.

“Our goal is to bring people together by making workable standards that are driven by the needs of business users,” says David Lounsbury, CTO of the Open Group.

Following the group's guidance, OPA is organized into several subcommittees to work on different parts of its vision for an open, interoperable process control system. These working groups are led by a mix of end users and suppliers (Figure 2). More subcommittees can be added as needed, most likely to address future tasks like certification and component discovery. This is similar to how the FACE group is organized.

[sidebar id =3]

To develop its open architecture, OPA's Enterprise Architecture (EA), Business and Technical working groups will use the Open Group Architecture Framework (TOGAF) architecture development method (ADM), a procedural tool for the acceptance, production, use and maintenance of architectures. It's based on an iterative process model, supported by best practices, and includes a reusable set of architectural assets. The three working groups will develop documents and figures for the interoperability architecture, with the EA group coordinating the work.

For example, the EA working group will define business problems, model their environments, document objectives and KPIs, identify the business and technology actors who can solve them, document requirements, and refine them as needed. TOGAF's business scenarios will then be used to make sure whole problems are understood and can be related to business value.

Lounsbury adds that one way OPA will simplify its own interoperability efforts is by researching, selecting, adopting and reusing existing standards and strategies that meet its requirements, such as ISA-95/IEC 62264 for control system integration and IEC 61499 for function blocks. “There's no value in reinventing a working standard. We look to incorporate other standards where we can, and liaise with other standards bodies,” adds Lounsbury. “The Open Group has an extensive network of liaison agreements to facilitate cooperation, adoption and reuse.”

[sidebar id =4]

Jim Hietala, VP of business development and security at the Open Group, adds that, “The process industries are at a whiteboard moment, and OPA is a great way to get involved, determine what their future will look like, and speak to the supplier community with one voice. This will also benefit suppliers because they'll be able to grow the market for DCS by expanding it to other industries.”

Exxon's Bitar adds, “The Open Group is our secret weapon. In the past, standards were written without thinking about an endgame. In this case, OPA isn't beginning with a standard, but is starting with business pain points and goals. This will mean fewer battles over standards details, and help us avoid falling into the same traps as the efforts to develop fieldbus and wireless standards.

“Whether they're value-, cost- or security-focused, all automation users can benefit from open systems, but the window of opportunity to replace a DCS only occurs once every 20 to 30 years. To compete in the future, industrial manufacturers will require more open and fluid flows of information across the IT/OT boundary via secure connections, so join us on this journey and join OPA. This interoperability effort applies to all the process industries that use DCS or SCADA systems, and hopefully it will encourage users to go back to their management and say this is a worthwhile investment that's worth participating in.” 

About the Author

Jim Montague | Executive Editor

Jim Montague is executive editor of Control.