Just like everywhere else lately, artificial intelligence (AI) was front and center at ARC Advisory Group’s 30th annual Industry Leadership Forum, held Feb. 9-12 at Renaissance Orlando Sea World. More than 500 attendees from more than 280 companies attended presentations by more than 100 speakers in four tracks.
As usual, the forum kicked off with the 14th annual edition of its traditional cybersecurity sessions, organized by Sid Snitkin, ARC’s VP and GM of enterprise services. Cybersecurity strategies reported by Snitkin, sponsor Fortinet and the other speakers included enabling firewalls to perform deeper cyber-threat detection, as well as more granular network segmentation of equipment and other assets, known as micro-segmentation. They also referenced virtual patching, which addresses vulnerabilities in legacy devices without taking them out of operation, and using further segmentation to shroud those devices from exposure to possible cyber-attacks.
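The micro-segmentation idea described above can be sketched in a few lines: assets are grouped into small zones, and traffic is denied by default unless a flow between two zones is explicitly whitelisted. The asset and zone names below are invented for illustration and don't reflect any speaker's actual deployment:

```python
# Illustrative micro-segmentation sketch. Assets are grouped into small
# zones; cross-zone traffic is denied unless explicitly whitelisted.
# All asset and zone names here are hypothetical.

ASSET_ZONES = {
    "plc-01": "control",
    "plc-02": "control",
    "hmi-01": "supervision",
    "historian": "dmz",
}

# Only these zone-to-zone flows are permitted (default deny everything else).
ALLOWED_FLOWS = {
    ("supervision", "control"),   # HMIs may poll PLCs
    ("supervision", "dmz"),       # HMIs may push data to the historian
}

def is_allowed(src_asset: str, dst_asset: str) -> bool:
    """Return True only if the flow matches an explicit whitelist entry."""
    src = ASSET_ZONES.get(src_asset)
    dst = ASSET_ZONES.get(dst_asset)
    if src is None or dst is None:   # unknown assets are always blocked
        return False
    if src == dst:                   # traffic inside one micro-segment is allowed
        return True
    return (src, dst) in ALLOWED_FLOWS

print(is_allowed("hmi-01", "plc-01"))      # supervision -> control: True
print(is_allowed("plc-01", "historian"))   # control -> dmz: False, not whitelisted
```

The same default-deny pattern also captures the "shrouding" aspect of virtual patching: a vulnerable legacy device simply never appears in a whitelisted flow.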
The cybersecurity presenters also reemphasized that it’s still crucial to gain management’s buy-in and support, maintain cooperation between increasingly converged OT and IT staffs, and conduct ongoing training for personnel. They also reported that digital twins and other simulations can enable cybersecurity initiatives, and added that AI tools like ChatGPT have the potential to help, too, though most of those efforts are still in early, experimental stages.
Cybersecurity panel
In the first panel discussion, Lorena Nunes, OT cybersecurity specialist at petrochemical manufacturer Braskem, reported that machine learning (ML) functions have helped its intrusion detection system (IDS), and that it’s mainly experimented with using ML for auditing and rule changes. Braskem is also using large language models (LLMs) to find gaps in cybersecurity functions, and examines overall business directives to determine how and where AI might be useful, she said. “AI also may be able to speed up testing tasks, and allow users to respond faster.”
Dennis Hackney, OT cyber principal at Chevron, added that LLMs typically run at Levels 4 and 5 of the Purdue Enterprise Reference Model for industrial control system (ICS) cybersecurity, so his company is using them for business plans and reports. “We also require that all devices performing AI functions be registered,” said Hackney. “So far, these include some vision inspection components, though AI may be able to help with cybersecurity by enabling compliance mapping.”
In a second cybersecurity discussion, Marty Martin, process control director at Air Liquide, reported the best starting point for secure remote access and cybersecurity assessments is writing down all the requirements that a process, facility or organization needs, listing the actions and tools that a proposed solution should provide, and using the results to evaluate proposals.
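Martin's method, writing requirements down first and then scoring proposals against them, amounts to a weighted checklist. A minimal sketch, with invented requirement names, weights and vendor coverage:

```python
# Weighted-checklist evaluation of a remote-access proposal against written
# requirements, in the spirit of the process described above. Requirement
# names, weights, and vendor coverage are invented examples.

REQUIREMENTS = {                          # requirement -> weight (importance)
    "multi-factor authentication": 3,
    "session recording": 2,
    "granular access control": 3,
    "offline/air-gap support": 1,
}

def score_proposal(coverage: dict) -> float:
    """Return the fraction of weighted requirements a proposal covers."""
    total = sum(REQUIREMENTS.values())
    met = sum(w for req, w in REQUIREMENTS.items() if coverage.get(req, False))
    return met / total

vendor_a = {
    "multi-factor authentication": True,
    "session recording": True,
    "granular access control": False,
    "offline/air-gap support": False,
}

print(f"Vendor A covers {score_proposal(vendor_a):.0%} of weighted requirements")
```

Writing the requirements before soliciting proposals keeps the weights from being skewed by whatever features a particular vendor happens to offer.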
Brad Nash, who handles IT perimeter security at ExxonMobil, added that his company already had many hardware- and software-based operating systems, but growing demand for remote access is increasing its network connections. To know what’s on its network or trying to access it, Nash reported that ExxonMobil’s security operations center (SOC) documents all links with the outside world, and requires participants to explain why they want to connect, where they’re from and other details.
Jacob Partain, senior cybersecurity and process control engineer at BASF, reported his company uses a compliance-based strategy for cybersecurity risk assessment. In accordance with the IEC/ISA 62443 cybersecurity standard, it examines its operations network infrastructure for vulnerabilities, firewall performance and any backdoors or other issues, and adds appropriate risk-based procedures and devices.
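A compliance-based assessment of the kind Partain describes can be reduced to a per-asset checklist that reports every unmet control as a gap. The sketch below uses a few invented check names for illustration; it is not a mapping of actual IEC/ISA 62443 requirements:

```python
# Minimal sketch of a compliance-based risk assessment: each network asset
# is checked against a list of controls, and any unmet control is reported
# as a gap. Check names are illustrative, not actual IEC/ISA 62443 items.

CONTROLS = ["firewall enabled", "no backdoor accounts", "patched firmware"]

def find_gaps(assets: dict) -> dict:
    """Return, per asset, the list of controls it fails."""
    return {
        name: [c for c in CONTROLS if not status.get(c, False)]
        for name, status in assets.items()
    }

inventory = {
    "plc-01": {"firewall enabled": True, "no backdoor accounts": True,
               "patched firmware": False},
    "hmi-01": {"firewall enabled": True, "no backdoor accounts": True,
               "patched firmware": True},
}

print(find_gaps(inventory))   # {'plc-01': ['patched firmware'], 'hmi-01': []}
```

The gap list is then the input to the risk-based step: each unmet control gets a remediation procedure or compensating device proportionate to the risk it poses.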
AI expected to lighten data overloads
In the forum’s opening keynote on Feb. 10, Ashin Parikh, strategy and transformation SVP for global supply chain and operations at PepsiCo, reported its farm-to-table operations generate about 1 billion interactions daily, which require it to organize and manage about 15 petabytes of data. Parikh added that AI is assisting his company with this monumental task, making its operations more predictive, and helping it understand and react more quickly to market conditions. The four pillars where PepsiCo is applying AI tools include:
- Functional scope for operations that are compliant, safe, scalable, reusable, responsible and integrated by design.
- Multi-agent framework to help the company progress toward a hybrid, human-and-AI workforce.
- Digital twin 2.0 program that integrates a human- and society-based simulation engine.
- Generative AI (gen AI)-infused robotics.
“AI will help transform our distribution centers and warehouse facilities by enabling them to process more data more quickly,” said Parikh. “In fact, we’re already collaborating with Nvidia and Siemens on an effort to replicate some of these operations.” The partners expect these digital twins to:
- Use intelligent vision cameras and other devices to find bottlenecks, save time, increase throughput by 19.4%, and avoid $4.5 million to $5 million in expansion costs, for a total 25% reduction in capital costs at PepsiCo’s warehouse in Grand Prairie, Texas.
- Increase utilization by 54% by optimizing dock operations at its mixing center in Lancaster, Pa.
- Enable the transition from six-door operations to full automation for 55 semitrailers per day at its Arizona-based loading facility, which is also expected to enhance safety, labor efficiency and scalability.
- Improve design and construction of its warehouse and yard in Denver, and validate materials flows from its manufacturing and storage facilities.
In the second keynote, Chase Christensen, VP and CIO for business units and enterprise solutions at Jabil, reported that considering and applying AI must start with the value it can bring to processes and organizations, not just with its technology. This is why Jabil established an AI steering committee to reduce obstacles, determine how to invest in gen AI and agentic AI, identify and resolve problems, bring along and augment its users, establish strong data governance, and harvest insights.
“We use AI to deliver real-time data value across our operations. This includes developing an intelligent procurement assistant tool for securing and delivering raw materials for our electronic circuit boards, mechanical and packaging products,” said Christensen. “This assistant complements existing systems using LLMs because it gives users more options for working with their long-term data. For example, it used to be difficult to incorporate physics-based parameters into our mechanical intelligence solution, but AI lets it add schemas and algorithms that integrate this complex information more easily. Because our procurement assistant tool is embedded in existing systems, it puts control back in users’ hands because they’re supposed to be at the point where decisions are made. AI isn’t about finding a place to deploy its technology—it’s about finding problems that AI can help solve.”
These are just summaries of some of the initial sessions at this year’s ARC forum. Many others were equally informative, and will be available at www.arcweb.com and on YouTube.


