How to survive the oncoming train of technology
Be warned: the light at the end of the tunnel is a train! Control engineers are being dragged back into the world of manufacturing with new technologies that will affect both how and where they work.
By Walt Boyes, Editor-in-Chief, with Dan Hebert, Senior Technical Editor; Larry O'Brien, Research Director for Process Industries, ARC; and Gregory K. McMillan, CONTROL Columnist

THIRTY YEARS AGO, we began a quest to completely change the manufacturing process. Along the way we've improved quality, reliability, throughput and uptime, and created the most profitable manufacturing sector in history, with nearly all of these productivity gains coming from automation and from connecting the plant floor to the enterprise. Congratulations, colleagues, we did it.
Lynn Craig, one of the founders of WBF (the Forum for Automation and Manufacturing Professionals) and a member of the Process Automation Hall of Fame, says that from his years in manufacturing management, “control engineers isolated themselves from the rest of the manufacturing world, and were able to get away with it for a very long time.” But that time is over. “Control engineers are being dragged back into the world of manufacturing,” Craig concludes, “by standards like S88 and S95.”
David Beckman, retired Senior Vice President of Emerson Process Management, talked about the future of manufacturing and the role of process automation professionals in that future in his keynote address at WBF this year.
"The worm has turned," Beckman said. "For years, the trend was that, increasingly, the profits we generate go to financial analysts on Wall Street. Profits have evaporated from the people who do the work."
This is about to change. The drug industry has risen meteorically, pulp and paper appears to be rebounding, and the oil picture suggests we are in for a long period of rising prices. Every one of these growth sectors is going to require more, not less, engineering. And as first-world, and increasingly third-world, nations tighten environmental regulation, still MORE engineering will be required.
But even the big engineer-constructors like Fluor, Bechtel, Jacobs and Foster-Wheeler don't have the expertise to do the projects that are currently on the drawing boards. We tried to save our way to success by cutting staff and reducing training, and it is clear that this no longer works. Beckman showed a slide of a "new operator," with spiked hair and a tongue pierce, and asked, "What is the potential that they will be able to take over from us?"
The number of engineering graduates in North America and Europe is dropping dramatically. "When was the last time you saw a television show that featured the 'engineer guy' as the hero?" Beckman asked. He cited Dr. Ken Cooper, adjunct professor at the University of Pittsburgh, who surveyed a recent graduating class of engineers there and discovered that none of them could successfully define PID. Almost all knew that the "P" stands for proportional, fewer than half knew that the "I" stands for integral, and NONE of them knew that the "D" stands for derivative.
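For anyone who needs the refresher that graduating class did, a textbook positional PID controller fits in a few lines of Python. This is a minimal illustrative sketch, not any vendor's implementation; the class name and gains are our own:

```python
class PID:
    """Textbook positional PID controller: output = P + I + D terms."""

    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd  # proportional, integral, derivative gains
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = None

    def update(self, measurement, dt):
        error = self.setpoint - measurement
        self.integral += error * dt  # "I" accumulates past error
        if self.prev_error is None:
            derivative = 0.0  # no history yet on the first scan
        else:
            derivative = (error - self.prev_error) / dt  # "D" reacts to the error's rate of change
        self.prev_error = error
        return (self.kp * error  # "P" acts on the present error
                + self.ki * self.integral
                + self.kd * derivative)
```

Proportional alone leaves offset, integral removes it over time, and derivative anticipates where the error is heading; that division of labor is the whole definition those graduates were missing.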
"Industry has to take a strong hand in convincing academia that there is a crying need for control specialists," Beckman admonished. This dovetails almost eerily with a CONTROL editorial ("C'mon Vendors, Let's Step Up!" CONTROL, 2005). If end-user companies and vendors aren't careful, there won't be enough engineers to make stuff work.
In the old paradigm, the corporation is the master and we are the slaves. In the new paradigm, the means of production is knowledge, which is owned by the knowledge workers and is highly portable. In the new paradigm, a large number of employees will become knowledge workers, working under contract as professionals. "Job security," Beckman opined, "is in what you know, not who you work for. I can't tell you all of what the new paradigm will look like." But we can.
The new paradigm is headed toward process automation professionals very fast. As Rich Merritt said in his column in the August issue of CONTROL, "the technologies and techniques I wrote about have hit our industry like a freight train." That's right. While the process industries have been reviving and jobs have been increasing, the new paradigm is expanding, and the light at the end of the tunnel is a train.

The DCS Revolution Started It
This year marks the 30th anniversary of the Distributed Control System, or DCS. The development of the DCS closely mirrors that of process automation itself, moving from proprietary technologies and closed systems to commercial-off-the-shelf (COTS) components, industry standard field networks, and Windows operating systems. But the most important transformation was from a system focus to a focus on business processes and achieving operational excellence (OE) in process plants. The control engineer has been dragged into the world of manufacturing.
The DCS was introduced in 1975 by both Honeywell and Yokogawa, each of which had independently designed its own product: the TDC2000 and the CENTUM systems, respectively. This marked the dawn of the system-centric era of the 1970s. Central to the DCS model were function blocks, which remain the fundamental building block of control for DCS suppliers and for fieldbus technologies.
In the 1980s, users began to look to the DCS for more than just basic process control. Suppliers began to adopt Ethernet-based networks with their own proprietary protocol layers. The decade witnessed the first PLCs integrated into the DCS infrastructure, as companies such as Rockwell, Siemens, Schneider and others entered the DCS market, and it also saw the beginning of the Fieldbus Wars, which continue to this day.
From the 1980s and 1990s forward, suppliers discovered that the hardware market was shrinking, commoditizing, and becoming saturated. Larger suppliers found themselves competing with second tier suppliers, including some whose “DCS” emulation software ran completely on COTS products, like Wonderware, Iconics, Citect and others. The key large suppliers began to make the transition to a software and value-added service business model.
Just as it became apparent that the simple application of automation to manufacturing processes was not delivering the return on investment that it once did, suppliers began to offer new applications such as production management, model-based control, real-time optimization, plant asset management, and real-time performance management tools such as alarm management and predictive maintenance. In addition, suppliers began to offer the consulting services necessary to make all these new applications work.
In the 2000s, we’ve seen a dramatic shift from system-centric and technology-oriented approaches to a business-centric approach. Users now look at the overall business value proposition, with elements such as asset utilization, return on assets, and increased plant performance coming to the forefront. Technology for technology’s sake is no longer an effective argument for justification of automation products. Again, the control engineer is being dragged into the world of manufacturing.

What Time Is On Your Watch?
You can think of what has happened to our profession this way: In the early days, we were all concerned with how to build the watch. Knowing how to build the watch was critical to success.
Later, it became imperative to also know how to tell time, so your skill set expanded. Later yet, we realized that it wasn’t enough to know how to build the watch and tell time. Now we needed to know the benefits of being on time and the consequences of being late. So the skill set expanded once again.
That is where lots of process automation professionals are: we know how sensors work, we know how loops work, and we know how to control a process.
Unfortunately, the required skill set has expanded once again. Now it isn’t enough to be able to build the watch, tell time, and know what happens when you’re late and why you should be on time. Now we have to be experts on scheduling, because everything below that has become transparent.

Our primary value now is seeing to it that information from the plant is transmitted to the enterprise. You see, process automation now works in fourth-order concepts. Many of you aren’t there yet, and some of the new technologies aren’t quite there yet either. If you don’t get there, and soon, that train will flatten you like a penny on the track.

Disruptive Technologies R Us
We cover these disruptive technologies regularly, so we will just list them here:
- Collaborative engineering
- Integrated simulation and design/draw software
- Remote server applications, XML, B2MML and web services
- HMI and human factors engineering
- Automatic Identification and Data Capture (AIDC) technologies like RFID
- Robust wireless systems for monitoring and control
- Mobile computing
- More and better online analysis systems
- Smart sensors and final control elements
- Real-time performance management
- Real-time asset management
Looking a great deal like George Jetson’s control room, this three-dimensional display is currently working in an ExxonMobil facility. Source: 3D-Perception AS
Taken together, these disruptive technologies paint a detailed picture of what tomorrow’s plant will be. Designed in close collaboration between the process engineer and the control engineer, the plant processes will first have been simulated (see Sidebar below),
then the simulation data will have been exported to the plant design software and the control design software. Using AIDC and real-time asset management techniques, the plant will have “live-as-builts” that are continuously updated for maintenance and management. This will be true of both “brownfield” and “greenfield” plants, since any older plants that haven’t been upgraded to this level of control will have been abandoned.
The operator will be a supervisor most of the time, reacting to upsets of the control system and the plant process (see Figure 1). Ian Nimmo, human factors guru and president of User Centered Design Services LLC, says, “In a properly run plant, the operator should not have to intervene in the plant operation except in the case of an upset, and any time the operator has to do anything, it is an upset.”

What Should You Know, and When Should You Know It?
Not only is there a shortage of competent process automation professionals, both in the end-user ranks and in the ranks of vendors and vendor service groups, as Dave Beckman pointed out; there is also a significant need for advanced process automation training. WBF, ISA and other organizations have highly developed training courses aimed directly at the process automation professional, many covering the disruptive technologies we’ve identified. Yet, as our July Control Poll indicates, over 45% of you won’t pay for training that your employer doesn’t provide. It isn’t a good bet that more training facilities will become available if this trend continues. And it is a good bet that, if you don’t get training on these new trends, you will be out on your ear, regardless of the shortage.
How to Survive the Oncoming Train
“Our Inescapable Data vision suggests that it is not just advances in each of these technologies, it is the combination of these fundamental elements that will break barriers and magnify gains to levels not yet anticipated. We think these new combinations will lead to an explosion of benefits driving both higher personal and economic satisfaction.” This is from a new book, Inescapable Data: Harnessing the Power of Convergence, from IBM Press, by Chris Stakutis and John Webster. Stakutis is the IBMer, CTO for emerging storage software; Webster is founder and senior analyst of Data Mobility Group, a consultancy based in Boston. Their thesis is that the convergence of technologies and the ubiquity of data will drive a revolution. Applying this to process automation, we get the vision of tomorrow’s plant we’ve given you. If you want to live long and prosper in this new world of inescapable data, you will need, as Rich Merritt encourages in his column, to “get your mind right about technology,” and start thinking in fourth-order terms. Remember, nobody cares how the watch is built or how time is kept.
What If? Virtual Plant Reality
By Gregory K. McMillan, CONTROL Columnist

THE CHEMICAL industry has had the software since the turn of the century to create a virtual plant, in which the actual DCS configuration resides on a personal computer together with a dynamic simulation of the process that has evolved into a graphically configured process and instrument diagram (P&ID). At the same time, the retirement of experienced engineers and technicians, the disengagement of the operator from the process caused by advanced control, and worldwide competition have created a greater need than ever for training and optimization. Potentially, the virtual plant can go where the real plant hasn’t gone, for good or bad reasons, and replace myth with solid evidence and engineering principles.
What if a virtual plant offered the ability to explore new operating regions and abnormal conditions, and to help develop and prototype new control systems, while meeting the basic but growing need of keeping operator and process engineer skills fresh?

Simulations Behaving Badly
The results in the chemical industry have been spotty at best, with some notable successes but too many disappointments. Dynamic simulations of sufficient fidelity to design control systems and train personnel in process operation often require an investment in software and engineering of more than $100K, so when the behavior of the model is erroneous, you have unhappy customers. A misleading simulation is worse than no simulation.
Unfortunately, so-called “high fidelity” process simulators show bizarrely unrealistic dynamic responses, and may even develop fatal numeric errors, during startup, shutdown, extremely abnormal situations, and instabilities. These process models are analogous to a flight simulator that is fine while the plane is cruising but erratic, or prone to crashing, during a takeoff, landing, wind shear, or rudder problem.
Forget about trying to accurately simulate batch operations and exothermic reactors with simulators designed for continuous steady state operation. Empty vessels, zero flows, non-equilibrium conditions, and process non-self-regulation are not properly addressed.
Even at normal operating conditions, the total loop dead time is often off by 1000% or more. Since the integrated absolute error in most key control loops is proportional to the dead time squared, the simulation gives no inkling of the control problem. The match between the process time constant of the plant and that of the model fares somewhat better, but errors of 50% or more are quite common. The process gain is generally the most accurate, but even then it is not suitable for direct use in tuning or in a model predictive controller. Why are process simulators so far behind flight simulators?
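That dead-time sensitivity is easy to quantify. Using the rule of thumb stated above, that the integrated absolute error (IAE) of a key loop is proportional to the dead time squared, a two-line Python sketch (the function name is ours) shows why a 1000% dead-time error wrecks the model's usefulness:

```python
def iae_ratio(dead_time_a, dead_time_b):
    """Ratio of the IAE of loop A to loop B under the rule of thumb
    that IAE is proportional to (total loop dead time) squared."""
    return (dead_time_a / dead_time_b) ** 2

# If the real dead time is 10x the simulated value (a 1000% error),
# the real loop sees roughly 100x the IAE the simulation suggests:
print(iae_ratio(10.0, 1.0))
```

Even the "somewhat better" 50% time-constant mismatch the text mentions compounds this, since tuning derived from the wrong dynamics makes the real loop's recovery even worse.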
Flight simulators can focus on the servo response of hydraulic controls for well-defined components in air or the ions in space, whereas chemical processes have significant dead times, pneumatic actuators, and thousands of poorly defined compounds. The physical properties (e.g., density, mass heat capacity, and boiling point) of mixtures as a function of composition, pressure, and temperature are missing. Often hypothetical compounds must be created and the simulator obliged to estimate the relationships. Just try to find hydrochloric acid and sodium hydroxide in a physical property package. If you combine this data problem with the uncertainties, disturbances, nonlinearities and slowness of the process, control valves and sensors, you are set up for failure.
IF YOU LOOK at the block diagram of a control loop in Figure 1 (above), good physical property data and a “high fidelity” simulator can yield an accurate process output for a given process input at steady state. When process simulators are developed by process engineers for process design, this is the beginning and end of the quest: you have a design point and the information you need for a process flow diagram (PFD).
However, control engineers want to know the time response of the process variables to changes in controller outputs and disturbances. The first-order-plus-dead-time response of the loop in manual (open loop) is the simplest representation of the dynamic response, and can be used for tuning PID controllers or for setting up model predictive control. The overall open-loop gain is the product of the valve, process and measurement gains; the total loop dead time is the sum of the pure dead times (time delays) and small time constants (time lags); and the open-loop time constant is the largest time constant in the loop, wherever it resides.
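The first-order-plus-dead-time model just described is simple enough to simulate directly. The sketch below (function name ours; explicit Euler integration assumed) generates the open-loop step response a control engineer would fit for gain K, time constant tau, and dead time theta:

```python
from collections import deque

def fopdt_step_response(gain, tau, dead_time, dt, n_steps, step=1.0):
    """Open-loop step response of a first-order-plus-dead-time process,
    PV(s)/CO(s) = K * exp(-theta*s) / (tau*s + 1), by explicit Euler."""
    delay_steps = int(round(dead_time / dt))
    pipeline = deque([0.0] * delay_steps)  # FIFO models pure transport delay
    pv = 0.0
    response = []
    for _ in range(n_steps):
        pipeline.append(step)               # controller output steps at t = 0
        delayed_input = pipeline.popleft()  # what the process sees theta seconds later
        pv += dt * (gain * delayed_input - pv) / tau  # first-order lag toward K * input
        response.append(pv)
    return response
```

Fitting K, tau, and theta from a response like this is the starting point for common tuning methods and for setting up the model in a model predictive controller.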
Process simulators typically don’t address mixing and transportation delays in the response of the process and the sensor. They also don’t include lags or delays for different types of sensors or actuators, or the installed characteristic, deadband (backlash), and resolution (stick-slip) of various control valve and positioner designs. The thermal lags of coils and jackets may not even be simulated, and non-equilibrium conditions are ignored. In other words, the parameters most “high fidelity” process simulators focus on are the gain and primary time constant of a volume once the continuous process is up to speed.
Process simulators may not even get the process gain right because the manipulated variable used by the process engineer might be a heating or cooling rate, which neglects the dynamics associated with utility system design and interactions. The vapor is also assumed to be instantaneously in equilibrium with the liquid by means of a “flash” calculation. A dynamic process simulator often shows the temperature response of a perfectly mixed volume to an instantaneous change in heat input or removal rate for equilibrium conditions, which for columns and evaporators is after the mixture reaches its boiling point. Wouldn’t it be grand if real plants started up or responded this rapidly and ideally? Wouldn’t it be super if the severe discontinuities of split range valves and the lags and upsets from fighting utility streams were just a bad dream?
Control engineers who understand the importance of dead time and loop dynamics need to take the lead in developing the models in dynamic simulations. A new definition of model fidelity is needed, one that emphasizes the dynamic response of the process variables to changes in manipulated or disturbance variables. It should cover unusual and non-steady operating conditions during startup and batch sequencing, even if this means sacrificing the match between a given process input and output at design conditions. If we define and seek fidelity in terms of what the control system and the operator see and have to deal with, the promise of the virtual plant can be fulfilled.