
Pain Relief for Open-Systems Complexity

Aug. 1, 2007
Keith Larson sat in on a presentation at the Honeywell Users Group Americas 2007 Symposium, held in June. See what he has to say.
“May you get what you wish for.” These words of ambiguous portent were how ExxonMobil Chemicals’ Bruce Turrie began to characterize the process control industry’s transition from proprietary platforms to open systems and commercial off-the-shelf (COTS) computer technology. I sat in on Turrie’s thought-provoking presentation at the Honeywell Users Group Americas 2007 Symposium in June.

Indeed, open systems have had positive effects, including cheaper and quicker initial installs and greater flexibility and capabilities than their proprietary predecessors, Turrie explained. But those positive aspects have come at a price.
Chief among Turrie’s open-systems complaints is the overall management complexity—and concomitant life-cycle support costs—represented by the proliferation of interdependent hardware, operating systems and applications—all with relatively short and asynchronous life cycles.

“Today we have nine to 12 open systems boxes for every proprietary box they replaced,” Turrie said. Proprietary control systems had a life expectancy of 15 to 25 years, but today applications are upgraded every one to three years, operating systems every three to five and computer hardware every four to six, Turrie estimated. Further, each application is delivered on a somewhat unique platform with a different security model, user interface and engineering tools.

Open systems also have brought with them an expanded list of needed skill sets—adding networking, operating systems management, security domain management, OPC and databases. “And many of these technologies require in-depth knowledge to get them right,” he added.

So, if open systems have introduced so much pain and complexity, what might be done to mitigate the problem?
Turrie’s contention is that companies such as ExxonMobil should consider moving to a centrally—and perhaps remotely—managed control systems architecture, using new technologies to ultimately decouple hardware, operating system and application upgrade cycles.

Server virtualization, for example, is a well-established technique in the broader IT world by which multiple operating systems and applications can be spread over—and insulated from the hardware vagaries of—multiple physical boxes, all running on a relatively simple open-source kernel. With server virtualization, “applications are just files,” Turrie explained, “and we can effectively decouple hardware and software refresh cycles.”
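To make the “applications are just files” idea concrete, here is a minimal command-line sketch of the virtualization workflow Turrie describes, assuming a Linux host with the QEMU tools installed (the file and host names are hypothetical, for illustration only):

```shell
# Create a 20 GB disk image -- the guest's entire operating system
# and application stack will live inside this single file.
qemu-img create -f qcow2 control-app.qcow2 20G

# Boot a guest OS from installation media into that image; the guest
# sees generic virtual hardware, not the physical server underneath.
qemu-system-x86_64 -m 2048 -hda control-app.qcow2 -cdrom guest-os.iso

# Hardware refresh becomes a file copy: move the image to a new
# physical server and boot it there, with the guest none the wiser.
scp control-app.qcow2 newhost:/vms/
```

Because the guest is insulated from the physical hardware, the server underneath can be replaced on its own schedule—which is exactly the decoupling of refresh cycles Turrie is after.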

“Are we ready for this in process control?” Turrie asked. “Probably not, but we ought to be thinking about it.

“We need to challenge established practices and continuously evaluate new management and infrastructure technology,” he added, with the goal of separating data itself from configuration information, and configuration information from application functionality. “We need to aggressively seek out new tools with the goal of decoupling systems.”

Extrapolating Turrie’s thought process even further, I can’t help but wonder whether and to what extent automation providers may be able to leverage rapidly evolving software-as-a-service business models. Or, can process automation companies provide their capabilities on a subscription basis, in which functionality is provided, but the software itself isn’t packaged or delivered by traditional means?

Think about Google’s business model. “None of the trappings of the old software industry are present,” noted blogger Tim O’Reilly in a recent treatise. Google’s value is delivered directly over the web, with customers paying—directly or indirectly—for the use of that service, O’Reilly says. “No scheduled software releases, just continuous improvement. No licensing or sale, just usage. No porting to different platforms so that customers can run the software on their own equipment, just a massively scalable collection of commodity PCs running open source operating systems plus homegrown applications and utilities that no one outside the company gets to see.”

While few of us, I think, would advocate or envision moving process control itself onto remote server farms, what if the care and feeding of your resident automation systems were remotely and securely managed, such that no one inside your company had to see them on a routine basis? If control specialists could focus on control and optimization, not managing IT upgrades? Now there’s a value proposition I think many in our industry could appreciate.

About the Author

Keith Larson | Group Publisher

Keith Larson is group publisher responsible for Endeavor Business Media's Industrial Processing group, including Automation World, Chemical Processing, Control, Control Design, Food Processing, Pharma Manufacturing, Plastics Machinery & Manufacturing, Processing and The Journal.
