Reimagining the optimization of industrial operations
Key Highlights
- Emerson's Enterprise Operations Platform integrates operations across field, edge and cloud to enable enterprise-wide visibility and autonomous decision-making.
- The acquisition of AspenTech enhances Emerson’s capabilities in industrial AI, process design and digital twin-based optimization, supporting a shift from control-centric to data-centric automation.
- A unifying data fabric consolidates fragmented operational data, ensuring accuracy and enabling AI applications to optimize across multiple conflicting goals like production, reliability and sustainability.
With a push toward artificial intelligence (AI), unified data fabrics and edge-to-cloud architectures, Emerson is working to create intelligent, self-optimizing enterprises powered by contextualized data and predictive ability. To find out more, Control talked to Peter Zornio, chief technology officer at Emerson, about the future of industrial automation.
Q: What does Emerson envision as the future of automation, given the push to AI, data and the recent acquisition of AspenTech?
A: Emerson’s vision for an industrial automation platform integrates operations across the intelligent field, edge and cloud to empower enterprise-wide visibility, optimization and autonomous operations. We call it the Enterprise Operations Platform—a unified suite of software across those computing domains.
We believe we're in a unique position to deliver this given our unparalleled technology stack from field devices to optimization software. A big step in building this capability was our recent acquisition of AspenTech, with its deep software expertise as a leader in industrial AI, bringing decades of experience in process design and digital twin-based optimization to Emerson’s automation portfolio.
This direction represents a fundamental shift in how we approach automation, moving from a rigid, control-centric perspective to a more open, modular and data-centric model at the core of that unified suite. It’s designed to unlock siloed data and let information move freely across the operating domains of production, reliability, safety and sustainability, which together span much more than control alone.
The goal is to help customers use and manage both new and existing rule-based and AI applications, empowering enterprise-wide visibility and optimization, and driving more autonomous operations.
Q: What must happen for a fully autonomous manufacturing plant to become a reality?
A: For some facilities, it’s not nearly as far off as you might think. The concept of self-optimizing plants and assets is nothing new. Simpler, more standardized facilities such as air separation processes and offshore platforms are unmanned today. Arguably, the industry has been on the path to full autonomy since digital transformation became the governing paradigm years ago, and industry leaders are implementing some aspects of it as we speak.
What autonomous operations require are self-adapting, self-learning and self-sustaining software and process control technologies that work together to anticipate future conditions and act accordingly, adjusting operations within the context of the enterprise. Predictive reliability is critical: you must avert potential production interruptions before they occur. I believe this is the biggest obstacle to true autonomous operations for more complex facilities, and it has the biggest upside potential from the application of AI.
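To make "averting interruptions before they occur" concrete, here is a minimal sketch of trend-based prognostics in Python. Everything in it is illustrative rather than Emerson code: the vibration readings, the alarm limit and the one-reading-per-day cadence are all invented for the example.

```python
# A toy sketch of predictive reliability on invented data: fit a linear
# trend to a degrading health signal and estimate when it will cross an
# alarm limit, so maintenance can be scheduled before a trip occurs.

vibration = [2.1, 2.3, 2.2, 2.6, 2.8, 3.0, 3.3]  # mm/s, one reading per day (made up)
ALARM = 5.0  # mm/s, assumed trip threshold

# Ordinary least-squares line through (day, reading) pairs.
n = len(vibration)
x_mean = (n - 1) / 2
y_mean = sum(vibration) / n
slope = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(vibration)) \
        / sum((x - x_mean) ** 2 for x in range(n))
intercept = y_mean - slope * x_mean

# Project forward: when does the fitted line hit the alarm limit?
days_until_alarm = (ALARM - intercept) / slope - (n - 1)
print(f"trend {slope:.2f} mm/s per day; alarm crossed in ~{days_until_alarm:.0f} days")
```

Real prognostics blend equipment and process data and use far richer models, but the principle is the same: project the trend, and schedule the intervention before the threshold is crossed.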
The transition to autonomous operations can't be about replacing what already exists—it must be about modernizing in place. As mentioned, pervasive, real-time data access—a “data fabric”—is critical for the next steps, especially the ability to combine equipment and process data to improve reliability prognostication and production impact. Sustainability data is necessary as most manufacturers have made commitments in that area; it must be accurately measured, tracked and figured into the overall optimization equation.
Better OT/IT information convergence will enable integrated business processes, reducing the gap between planned and actual performance. Key functions, such as planning and scheduling, can become more closely integrated and aligned with closed-loop automation systems, such as advanced process control and dynamic optimization. By incorporating the best insights from engineering, maintenance and supply chains, companies can gain the holistic view needed to achieve even higher levels of performance.
Each step on the path to autonomy will create incremental value along the way, as companies target AI-powered technology to address specific business needs, and empower their workforce in new ways. Complex, changing processes may never reach the nirvana of no human intervention; the level of effort necessary for that may not provide a justifiable ROI vs. having a small staff. But that staff can very likely be remote at that point.
Q: You’ve said data and the “unifying data fabric” are at the core of future industrial automation. Can you explain?
A: Manufacturing facilities today have lots of operational data. Core production data from automation systems is the heart of a facility's operational data, but reliability, safety, quality, energy, planning, inventory, emissions—many functions have their own data repositories and the applications that use them. Those all grew up independently, driven by the functional organizations responsible for them. The same piece of equipment or function can exist in those different systems simultaneously under different names, with different sets of data and different perspectives. To truly optimize across all domains, all this data needs to enter the optimization calculation. Today, we build interfaces between these functional systems as needed for a particular application, creating a hodgepodge of hard-to-maintain, specialized connections.
A better approach is a unifying data fabric—a data infrastructure that provides uniform access to these fragmented sets of data under a common contextual model. Data that enters the fabric is confirmed to be accurate and reliable—for all uses. This lets AI and other applications be developed and deployed much more easily; today, a large portion of the effort in deploying AI is making those connections and validating the data brought into the AI application.
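As a thought experiment on what a "common contextual model" might look like in code, here is a deliberately tiny Python sketch. The class names, tags and validation rules are hypothetical illustrations, not an actual Emerson or AspenTech API: each source system registers its measurements under one shared asset context, values are validated once on ingest, and every consumer queries the same trusted view.

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Tag:
    asset: str          # common asset identifier, e.g., "pump_101" (invented)
    measure: str        # e.g., "vibration_mm_s", "flow_m3_h"
    source: str         # originating system, e.g., "historian", "dcs"
    validate: Callable[[float], bool]  # ingest-time quality check

@dataclass
class DataFabric:
    tags: dict[tuple[str, str], Tag] = field(default_factory=dict)
    values: dict[tuple[str, str], float] = field(default_factory=dict)

    def register(self, tag: Tag) -> None:
        # One contextual name per asset/measure, regardless of source system.
        self.tags[(tag.asset, tag.measure)] = tag

    def ingest(self, asset: str, measure: str, value: float) -> None:
        # Validate once on entry so every downstream use can trust the value.
        tag = self.tags[(asset, measure)]
        if not tag.validate(value):
            raise ValueError(f"rejected bad data for {asset}/{measure}")
        self.values[(asset, measure)] = value

    def query(self, asset: str) -> dict[str, float]:
        # Uniform access: every application sees the same validated view.
        return {m: v for (a, m), v in self.values.items() if a == asset}

fabric = DataFabric()
fabric.register(Tag("pump_101", "vibration_mm_s", "historian", lambda v: 0 <= v < 50))
fabric.register(Tag("pump_101", "flow_m3_h", "dcs", lambda v: v >= 0))
fabric.ingest("pump_101", "vibration_mm_s", 4.2)
fabric.ingest("pump_101", "flow_m3_h", 118.0)
print(fabric.query("pump_101"))  # {'vibration_mm_s': 4.2, 'flow_m3_h': 118.0}
```

The point of the sketch is the contract: name once, validate once, consume everywhere. That contract is what removes the per-application integration and data-validation effort described above.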
The more disparate types of data brought into the data fabric, the more AI can ascertain the correlations between them. Then, facilities can be fully optimized across what can be sometimes conflicting goals, such as production output vs. reliability vs. sustainability. What were previously undetected, unintended consequences of actions become known, and can figure into the calculations of autonomous agents.
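One simple way to picture optimizing across conflicting goals is weighted scalarization: score each candidate operating point against every objective, then weight the objectives against each other. The candidates, scores and weights below are made up purely to show the mechanics:

```python
# Illustrative only: a weighted-sum trade-off across conflicting goals.
# Operating points and their (invented) objective scores, each in [0, 1].
candidates = {
    "max_throughput": (0.95, 0.60, 0.55),
    "balanced":       (0.85, 0.80, 0.75),
    "eco_mode":       (0.70, 0.85, 0.95),
}

# Relative priorities for (production, reliability, sustainability).
weights = (0.5, 0.3, 0.2)

def score(point: tuple[float, float, float]) -> float:
    # Scalarize the three objectives into one comparable number.
    return sum(w * v for w, v in zip(weights, point))

best = max(candidates, key=lambda name: score(candidates[name]))
print(best, round(score(candidates[best]), 3))  # balanced 0.815
```

Real optimizers work over continuous process models rather than three hand-picked points, but the interview's point still shows through: once cross-domain data exposes a previously hidden correlation, say throughput quietly degrading reliability, it changes the scores and therefore the answer.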
For Emerson, this data fabric becomes the unifying infrastructure across our suite of offerings. This means products in the Emerson suite are inherently integrated with each other, eliminating expensive integration projects and the ongoing cost of maintaining custom connections. Security is inherent and uniform across the suite. Building the data fabric at an existing facility the first time isn’t a trivial undertaking; but it’s the critical piece of infrastructure to unlock the future benefits we’ve discussed.
Q: The more connectivity that’s built into enterprise data systems, the more critical cybersecurity becomes. How does Emerson plan to address such a fundamental issue?
A: Traditional automation security solutions typically restrict data access to the plant environment and to specific functional systems and organizations. As we’ve discussed, the free flow of “democratized” data across domains is necessary for true optimization and autonomy.
This is where we're counting on zero-trust principles to enhance security. Zero-trust assumes the network has been or will be compromised, and that no user or asset can be implicitly trusted. It means security needs to come down to the application level, not just the physical network/system level. This would enable applications to be deployed at any level in the architecture while ensuring security. It will be a journey to get there, and today’s segregation techniques will continue to play a role in security for quite a while. Careful integration of old and new communications protocols will be essential to ensure they operate together with confidence, paving the way to an open yet secure architecture based on flexible technology and intuitive integration.
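As an illustration of pushing security "down to the application level," here is a toy Python sketch in which every request must carry a fresh, verifiable signature and a per-application permission, with no trust granted for network location. The shared key and ACL are stand-ins for a real identity and policy system:

```python
import hashlib
import hmac
import time

SHARED_KEY = b"demo-key-rotate-me"            # stand-in for workload identity
ACL = {"optimizer-svc": {"read:pump_101"}}    # invented per-application permissions

def sign(identity: str, action: str, ts: int) -> str:
    # Caller signs who it is, what it wants, and when it asked.
    msg = f"{identity}|{action}|{ts}".encode()
    return hmac.new(SHARED_KEY, msg, hashlib.sha256).hexdigest()

def authorize(identity: str, action: str, ts: int, signature: str) -> bool:
    fresh = abs(time.time() - ts) < 30                             # reject stale/replayed calls
    valid = hmac.compare_digest(signature, sign(identity, action, ts))
    allowed = action in ACL.get(identity, set())                   # least privilege
    return fresh and valid and allowed                             # verify every single request

now = int(time.time())
sig = sign("optimizer-svc", "read:pump_101", now)
print(authorize("optimizer-svc", "read:pump_101", now, sig))   # True
print(authorize("optimizer-svc", "write:pump_101", now, sig))  # False: not signed or permitted
```

In practice this would be certificate-based workload identity and a policy engine rather than a shared key, but the zero-trust property is the same: each call is authenticated and authorized on its own merits, wherever in the architecture it originates.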
Q: Data, AI, security—that’s a lot of technology. Is this the answer for digital transformation?
A: It’s certainly a large part, but of course, it’s not the whole answer. Clear business goals, change leaders, work process change that sticks, cross-department collaboration, long-term commitment—all those “non-technical” aspects are often the make-or-break ones.