ExxonMobil, Honeywell bond over enterprise-level interoperability
Just as I/O, controllers, HMIs, networks and servers are working towards interoperability, enterprise-level systems and software want to gain similar efficiencies and benefits, too. This is why some of the largest oil, gas and chemical companies and related organizations are participating in the Open Asset Digital Twin (OADT) working group organized by ARC Advisory Group.
Michel Teughels, strategic manager for process operations and quality management at ExxonMobil, described the interoperability capabilities and requirements that OADT is striving for, and presented a progress report on its efforts this week at Honeywell Users Group 2025 EMEA in The Hague in the Netherlands.
“Two weeks ago, ExxonMobil announced plans to merge its upstream and downstream operations, and adopt processes that are more globally standardized,” said Teughels. “This is a challenge because most upstream applications and facilities usually operate as little kingdoms with their own needs. However, they need to standardize on one appropriate methodology and seek interoperability because subsequent implementation projects, adjustments, maintenance and updates will be more efficient and go much quicker.”
Some layers of history
Located at Level 4 and above of the seven-layer Purdue reference model for industrial control systems, enterprise technologies include manufacturing execution systems (MES), enterprise resource planning (ERP) software, and other IT-based business and planning solutions.
Level 0 through Level 3 consists of sensors, I/O, PLCs, SCADA systems and servers. These technologies are arguably closest to achieving genuine interoperability because many end-users and suppliers have been developing and testing the Open Process Automation Standard (O-PAS) for the past 10 years.
In fact, ExxonMobil’s open process automation (OPA) Lighthouse project started operations a year ago as the first commercial instance of an OPA-guided distributed control architecture at its resin-finishing plant in Baton Rouge, La. It manages about 100 control loops and 1,000 I/O, and began making product and generating revenue on Nov. 18, 2024.
Reasoning out the interoperability need
Teughels reported that ExxonMobil has partnered with Honeywell Process Solutions for 20 years, so they’re used to working with each other. This relationship also helped them investigate interoperability for close to two years, and collaborate closely with Shell, Chevron and OADT’s other asset owner-operator participants. The working group is scheduled to publish its business principles and requirements this month.
To detail the enterprise-level interoperability OADT is seeking, Teughels used the example of a typical heat exchanger. Oil and gas, chemical and other process industries operate thousands of heat exchangers worldwide, so they’re a good representation of present realities, such as:
- Data captured in countless different models, or represented through limits and boundaries.
- Analytics combining asset and process data.
- Highly diverse personas and use cases driving diversity in requirements, such as accuracy.
- Lack of integration driving cost and complexity.
- Many use cases that aren’t cost justifiable.
- Unrealistic intellectual property (IP) protection because security models are defined by each application.
“Heat exchangers run for a certain time, and then they need to be cleaned, which is costly and also requires lots of documentation,” explained Teughels. “Users expect to get back exchangers that will perform better, but they’re often not checked or measured to see if their performance has improved. This would be a good idea, but there are many different users involved with different reasons for tapping into their data.
“Many users acknowledge they could save lots of money ‘if we could only’ complete a certain task. However, software, licensing and even bigger asset integration expenses get in the way. So, users spend millions of dollars doing the same procedures over and over, and benefits decrease even as costs go up. Software and devices also come with security models that define who can use them and who can’t, and so we end up with binary security models that are also costly.”
Seeking a digital-twin data reuse model
To reuse data in its own enterprise ecosystems, Teughels reported that ExxonMobil employs three basic strategies:
- Single platform with internal components that can connect, while other technology platforms are inaccessible, which delays potential innovations.
- Multiple platforms with different databases each used to visualize the same information, enabling innovation but also introducing added complexity.
- Labor-intensive custom conversions by vendors or third parties that still lose data, making innovation and scaling time-consuming and costly.
“The multiplatform strategy usually depends on one person relaying information between them, such as Excel sending data to other platforms and back. However, this can be very stressful for whoever is in the middle,” explained Teughels. “These three presently available models aren’t desirable, which is why we need a fourth option based on digital twins that we’re developing with ARC and OADT.”
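The “custom conversion” strategy Teughels criticizes can be sketched in a few lines. This is a hypothetical illustration, not an actual ExxonMobil or vendor schema: when one platform’s asset record is mapped onto another platform’s narrower data model, any field the target schema doesn’t define simply vanishes, which is exactly the data loss that makes scaling costly.

```python
# Hypothetical sketch of a lossy vendor-to-vendor conversion. All field
# and schema names here are illustrative, not taken from OADT or any
# real product: vendor B's schema lacks a field vendor A models, so the
# converted record silently loses information.

VENDOR_A_RECORD = {
    "tag": "HX-101",
    "duty_mw": 4.2,
    "fouling_factor": 0.0003,       # only modeled by vendor A
    "last_cleaned": "2024-06-01",
}

VENDOR_B_SCHEMA = {"tag", "duty_mw", "last_cleaned"}  # no fouling_factor

def convert_a_to_b(record: dict) -> tuple[dict, set]:
    """Map a vendor-A record onto vendor B's schema, reporting lost keys."""
    converted = {k: v for k, v in record.items() if k in VENDOR_B_SCHEMA}
    lost = set(record) - VENDOR_B_SCHEMA
    return converted, lost

converted, lost = convert_a_to_b(VENDOR_A_RECORD)
print(converted)   # the fouling factor is gone from the converted record
print(lost)        # reports which keys were dropped in transit
```

Multiply this mapping across thousands of exchangers and dozens of application pairs, and the maintenance and re-validation burden is the “millions of dollars doing the same procedures over and over” that Teughels describes.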
Teughels added that the reality of ExxonMobil’s technical debt is that it has a legacy of systems with siloed data that are expensive to integrate, highly customized and complex. It aspires to unlock more of the value in processes and their data by:
- Accelerating the scaling of digital capabilities across assets and processes.
- Maximizing the value of data, and deploying AI at scale.
- Mitigating technological risks, improving flexibility, and achieving faster time to value.
- Providing a scalable security model to help protect data and IP.
Consequently, OADT’s initial business principles and requirements consist of:
- Open, with data that’s accessible through open and non-proprietary standard interfaces.
- Agnostic, with information stored in human-readable formats and in open, non-proprietary data models or schemas.
- Data ownership that can control access at the data layer, including out-of-the-box replications of the security model across third-party applications and at full scale. This includes one reference repository for data, where the security model is applied.
- Interoperability, with data stored in a central location that’s separated from the application. This solution likewise needs to integrate asset configuration data out of the box across third parties and non-proprietary standards.
- Transferability, with data separated from applications, requiring the ability to move information at full granularity across different environments, as well as to replace applications with different third-party solutions. The license must also cover export of all data and metadata, so no information is lost during the export-import process.
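A minimal sketch can make two of these principles concrete: data ownership enforced at the data layer, and data stored separately from any one application. This is an assumed design for illustration only, not OADT’s actual specification; the class, role names, and asset tag are all hypothetical. One reference repository holds the asset record in an open, human-readable form (JSON here), and applies the security model once, so every consuming application inherits the same access rules instead of defining its own.

```python
# Assumed illustrative design, not the OADT spec: a single reference
# repository enforces access control at the data layer and hands out
# JSON copies, keeping the data separated from any one application.
import json

class AssetRepository:
    def __init__(self):
        self._data = {}   # asset tag -> record
        self._acl = {}    # asset tag -> roles allowed to read it

    def put(self, tag, record, allowed_roles):
        """Store a record and the single security model that governs it."""
        self._data[tag] = record
        self._acl[tag] = set(allowed_roles)

    def get(self, tag, role):
        """Enforce access at the data layer, for every application alike."""
        if role not in self._acl.get(tag, set()):
            raise PermissionError(f"role '{role}' may not read {tag}")
        # Return a JSON round-tripped copy: callers get open-format data,
        # never a live reference into the repository's internal state.
        return json.loads(json.dumps(self._data[tag]))

repo = AssetRepository()
repo.put("HX-101", {"duty_mw": 4.2, "last_cleaned": "2024-06-01"},
         allowed_roles={"reliability", "operations"})

print(repo.get("HX-101", role="reliability"))   # permitted role
try:
    repo.get("HX-101", role="vendor_app")        # not in the ACL
except PermissionError as err:
    print(err)
```

Because the check lives in the repository rather than in each application, swapping one third-party tool for another leaves the security model untouched, which is the replication “out of the box across third-party applications” the requirements call for.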
“Each user wants openness and interoperability that’s just for them,” added Teughels. “However, we need an overall migration from a process-based digital twin to an asset-based digital twin because of the need to manage both assets and processes. Our vision is an open, interoperable, multivendor data platform with open standards that foster collaboration, flexibility and innovation; vendor-agnostic solutions that include the flexibility to work with already installed databases; and IP protection that enables granularity with a scaled security model. More specifically, this includes data distribution and configurable use cases, deployment methodologies, and an architecture that covers overall solution and data products.”

