When industrial organizations decide to incorporate AI into their operations, they’re really setting out to transform how they work, not just to solve a problem. For that reason, they must first define their reasons for using AI and address their apprehensions about it—feasibility, expectations and risks, in particular. In short, they must not only build a new operational model but also build trust throughout the organization.
Those were the baseline requirements described by a panel of end users in the oil and gas sector—Allison Paquette, facilities and operations of the future portfolio manager at Chevron, and Celeste LaBruyere, senior digital operations advisor at ExxonMobil—and Honeywell’s chief commercial officer, Tathagata Basu, during Honeywell User Group 2025 in San Antonio, Texas. The session was moderated by Rich Karpinski, principal analyst, S&P Global Market Intelligence.
The panelists acknowledged an inherent skepticism about AI, which is even more pronounced when it comes to applying it to process control. Before beginning, then, it’s important to establish a layer of trust so that adopting AI into operations becomes a partnership that includes everyone from the C-suite to maintenance technicians, rather than a new model pushed from the top down, according to Paquette. She said her organization did so by identifying critical roles and key stakeholders to bring along through scoping and development. “So, you’re initially able to have a bit of validation on your business drivers,” she said.
It’s part of what she called the cultural piece of adoption. As an example, she explained that Chevron works with the maintenance planning function within a business unit on opportunities to leverage AI to reduce errors and increase overall process efficiency. “People get excited to get a tool that makes their lives easier and reduces a lot of manual entry,” she said.
LaBruyere added that everyone must understand why the organization wants to add AI into the mix. “It’s about understanding what value it brings to the organization,” she said. “I think there's a lot of different sets of expectations when it comes to AI—what it can and can't do, and the lack of knowledge that leads to some misunderstanding and misuse.”
Basu added that, early on, trust tends to be a bigger hurdle than the technology itself, because generations of process control systems were designed as deterministic solutions and people are used to rules-based systems. “AI is probabilistic, and no AI solution is born perfect. It needs to be trained,” he said.
That shift requires a great deal of education across the industry, as well as an upgrade in the quality of the data organizations use to feed AI. “You have to start somewhere, but that somewhere must involve the right and enough data,” Basu added.
Data and technology
Basu pointed out that better data can lead to more trust and buy-in. “If we are able to quickly narrow down to a few areas in which we can demonstrate good quality data and what a robust AI engine can deliver with it, they become successful use cases,” he said.
All three panelists agreed that AI is not a panacea. Some problems can be solved without AI, so organizations must be honest with themselves and methodical in identifying the specific use cases where AI is the best solution. In the control world in particular, AI must be explainable.
When it comes to technology, adopting AI into process control is a stepladder approach, not a leap. Operators already have a lot of technology available—sensors, distributed control systems, historians, etc. Adopting AI should not require ripping and replacing any of it; instead, operators should build a framework that leverages that legacy infrastructure. Basu said Honeywell advises its customers to first figure out how to harness all the data, then contextualize the information, and only then run the AI.
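As a rough illustration of that harness-contextualize-analyze sequence, a minimal Python sketch might look like the following. The tag names, asset metadata and simple statistical check are hypothetical placeholders, not Honeywell’s implementation; a production system would draw on real historian connections and a trained model.

```python
from dataclasses import dataclass
from statistics import mean, stdev

# Step 1: harness raw readings, as they might arrive from a historian export.
# Tag names and values are hypothetical placeholders.
raw_readings = {
    "FIC-101.PV": [42.1, 41.8, 42.3, 55.9, 42.0],
    "TIC-202.PV": [180.2, 180.5, 179.9, 180.1, 180.4],
}

# Step 2: contextualize each tag with asset metadata so any finding can be
# explained against a named piece of equipment, not just a raw tag.
@dataclass
class TagContext:
    tag: str
    asset: str
    unit: str

context = {
    "FIC-101.PV": TagContext("FIC-101.PV", "crude feed pump", "m3/h"),
    "TIC-202.PV": TagContext("TIC-202.PV", "preheat exchanger", "degC"),
}

# Step 3: only now run the analytics. A simple z-score check stands in for a
# trained AI model; the ordering of the steps is the point, not the math.
def flag_outliers(values, threshold=1.5):
    mu, sigma = mean(values), stdev(values)
    return [v for v in values if sigma and abs(v - mu) > threshold * sigma]

for tag, values in raw_readings.items():
    outliers = flag_outliers(values)
    if outliers:
        ctx = context[tag]
        print(f"Review {ctx.asset} ({tag}): unusual readings {outliers} {ctx.unit}")
```

The point of the sketch is the ordering: data is gathered first, context is attached so any result can be explained against a named asset, and only then does the analysis run.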
“The focus is on bridging the gap between what an operator is doing and what they should be doing,” Basu added.
Managing expectations and rollout
Companies don’t want to invest where they don’t need to, so there can be initial tension around adopting AI technology in an organization, Paquette said. “When we talk about convergence, everyone in OT gets excited,” she said, “but there's a hunger to understand how to do it.”
She pointed to large language models as an initial opportunity to get the AI ball rolling. Chevron, Paquette said, is now going a step further and applying a visualization element.
She also stressed the importance of keeping the human element in the control loop, so operators can be sure they’re making the correct decision and making it effectively. At the same time, the company is working to bring data to the forefront so technicians can be even more comfortable with their decisions.
LaBruyere added that companies must be mindful of the feasibility of AI—for example, addressing local regulations or existing standards—and manage expectations for immediate results. “In our industry we’re very consumed with looking immediately at the cost savings. It’s very typical KPIs,” she said. “In the (AI) realm we are starting to look beyond that to the next generation of measurements, such as improved workflow and better decision-making.”
The question for most organizations these days is how far they can take AI now, and what the future holds.