Why do many process control technologies fail to make prime time, instead being relegated to special applications that are few and far between? Here I give what I see as the keys to a technology being successful and widely used in plant applications. How and to what degree each of the major technologies achieves these keys is discussed, along with what is left on the table.
Keys to Success
(1) Technology addresses the actual dynamics of industrial process applications (open loop steady state or integrating process gain, open loop time constant(s) and deadtime without which I could retire per my last Control Talk Blog)
(2) Technology successfully makes automatic adjustments to process without operator intervention
(3) Technology achieves benefits in terms of an increase in process efficiency and/or capacity
(4) Tools exist to identify the adjustments needed to achieve the best process performance
(5) Technology can be adjusted to perform well for different production rates and process conditions
(6) Operator can understand what the technology is doing to the process
(7) Application can be readily monitored for performance and problems easily identified and fixed
(8) Technology can be applied, maintained and improved by the average process control engineer
(9) Technology is widely applicable
(10) Technology does not distract operator and does not create false information
(11) Technology identifies unsuspected relationships
(12) Technology identifies unsuspected causes and effects
The PID is widely applicable and can be applied by the average process control engineer. Studies have shown that for control of a single process variable, the PID is near optimal in terms of rejecting load disturbances, as documented by Bohl and McAvoy in a landmark 1976 paper. This has not stopped thousands of papers from being written and millions of hours from being spent trying to invent a controller to replace the PID for single loop control. If the process has a runaway response (e.g., highly exothermic reactor temperature) or requires very fast execution (e.g., < 1 sec), the PID is the only safe solution.
The PID can be readily applied, but it is amazing how many of these controllers are not properly tuned. Part of this is due to a lack of real understanding of the modes and options. To confuse users, the integral mode gives the type of response seen in humans who are never satisfied, accentuated by digital displays with excessive resolution, with no anticipation or understanding of the trajectory. The result is that most loops have an order of magnitude or more too much integral action (too small a reset time), except for deadtime dominant loops as discussed in my last Control Talk Blog. Confusing the situation are over 100 tuning rules, with advocates having a closed mind to the relevance of other tuning rules, as discussed in my Control Global Whitepaper “So Many Tuning Rules, So Little Time”. While the average user can apply almost any set of tuning rules to get good performance, a consultant is often useful to get the best performance. Tuning software can automatically identify the actual dynamics for feedback control and provide recommended settings that can be scheduled online for different production rates and process conditions. The PID output for startup, batch operation and abnormal conditions can be scheduled automatically by the use of the Track or Remote Output modes. Cascade control can be effectively used to compensate for fast secondary disturbances and nonlinearities. Advanced Regulatory Control techniques such as feedforward, override and valve position control can be quickly implemented to increase process capacity or efficiency. When override control is used, the selected controller is evident to the operator. The external reset feedback feature must be used so that the unselected PID's offset in output from the selected PID output is the unselected PID error multiplied by its gain, providing predictability as to when an unselected PID will take over control.
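To make the connection between the identified dynamics and the settings concrete, the widely used lambda (internal model control) tuning rules compute the controller gain and reset time directly from the open loop gain, time constant and deadtime. A minimal sketch, with the 3x-deadtime default for lambda being my assumption of a common conservative choice rather than a universal rule:

```python
def lambda_tuning(Kp, tau, theta, lam=None):
    """Lambda (IMC) tuning for a self-regulating process.

    Kp:    open loop steady state process gain
    tau:   open loop time constant
    theta: total loop deadtime
    lam:   desired closed loop time constant (default: 3x deadtime)
    Returns (controller gain, reset time).
    """
    if lam is None:
        lam = 3.0 * theta                 # conservative default (assumption)
    Kc = tau / (Kp * (lam + theta))       # controller gain
    Ti = tau                              # reset time ~= process time constant
    return Kc, Ti
```

A smaller lambda gives tighter control at the expense of robustness to model error, which is why the choice of lambda, not the rule itself, is usually the real tuning decision.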
An enhanced PID can be used to enable the PID to use at-line and even off-line analyzer results for composition control with sensitivity of tuning settings to cycle time eliminated for self-regulating process responses and minimized for integrating process responses as described in the 7/06/2015 Control Talk Blog “Batch and Continuous Control with At-Line and Offline Analyzers Tips”.
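A rough sketch of the enhanced PID idea, assuming a simple incremental PI form that only moves when a fresh analyzer result arrives and integrates over the elapsed time since the last result; the class name and structure are illustrative, not a vendor implementation:

```python
class EnhancedPI:
    """Incremental PI that acts only on a fresh analyzer result.

    Between results the output is simply held, so the tuning is far
    less sensitive to the analyzer cycle time than a conventional PID
    executing every scan.
    """
    def __init__(self, Kc, Ti, out=0.0):
        self.Kc, self.Ti = Kc, Ti
        self.out = out
        self.prev_e = 0.0

    def new_result(self, sp, pv, elapsed):
        # elapsed: time since the previous analyzer result (cycle time)
        e = sp - pv
        self.out += self.Kc * (e - self.prev_e)      # proportional on change in error
        self.out += self.Kc * e * elapsed / self.Ti  # integral over elapsed time
        self.prev_e = e
        return self.out
```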
While the PID can be configured and tuned to provide automatic control for all sorts of situations, resulting in its use in over 99% of feedback control systems, there is a lack of a straightforward general approach to increase its effectiveness. While my previous blogs this past year supporting my YouTube videos on PID options and solutions give a useful perspective and a lot of details, what is needed is for someone to step back and give a clear step-by-step approach that allows for different objectives and tuning methods. By the way, these videos are now posted as part of a very practical and useful playlist of ISA Mentor Program webinars that should be of significant value to every automation engineer.
There are limitations to PID control that move users to consider Model Predictive Control (MPC). Most tuning software does not effectively and automatically identify the feedforward dynamic compensation needed by a PID. Also, anything more than half decoupling for a PID is confusing and too challenging for the average user. For compound or complex responses, where the initial response is different than the later response due to recycle or competing effects, tuning of the PID is difficult. Additionally, the simultaneous predictive honoring of multiple constraints is not within the normal capabilities of the PID. These situations all lead us to MPC to achieve greater process performance, especially since the MPC software may be integrated into the DCS and implemented by a simple configuration, making its use less expensive, faster and easier. Also, let’s face it, doing an article or presentation on a PID application is not going to get you as much recognition as doing one on an MPC application.
Model Predictive Control
MPC can potentially be used as a replacement for any PID control loop in continuous operation if the PID gain needed is less than ten, the execution rate does not need to be faster than 1 second, and derivative action is not essential for safe tight control for unmeasured disturbances. The temperature control of many liquid reactors requires too high a PID gain and too much derivative action to be a good candidate for MPC. Here ratio control of reactants (e.g., intelligent flow feedforward control) is often used without dynamic compensation beyond a simple equal filter applied to each reactant flow setpoint, as seen in the 7/26/2016 Control Talk Blog "Control Strategies to Improve Process Capacity and Efficiency - Part 3". For highly exothermic reactors, the positive feedback response can create a dangerous runaway condition with a point of no return. While technically processes should be designed so this cannot happen, the acceleration in the response from a batch polymerization reaction can exceed coolant capabilities, leading to relief valves popping and blowing over the reactants to a flare system. I have been in a control room when this occurred. Outside of these liquid reactors, there are many potential MPC applications, especially when new MPC technology can offer tight control of integrating processes.
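For the integrating case, the core predictive idea can be sketched in a few lines: predict where the PV will be after the deadtime at the current ramp rate, then compute the single output move that ramps the prediction to setpoint over a chosen horizon. This is an illustrative toy under an assumed pure-integrator model, not a commercial MPC:

```python
def mpc_move(sp, pv, ramp_rate, Ki, deadtime, horizon):
    """One predictive move for an integrating process (e.g., level).

    sp, pv:    setpoint and current process variable (%)
    ramp_rate: current PV slope (%/s), from the integrating response
    Ki:        integrating process gain (%/s per % of output change)
    deadtime:  loop deadtime (s); horizon: time to reach setpoint (s)
    Returns the change in controller output.
    """
    predicted = pv + ramp_rate * deadtime       # PV after the deadtime at current slope
    desired_ramp = (sp - predicted) / horizon   # ramp that reaches SP over the horizon
    # for a pure integrator, a change in output changes the ramp by Ki * delta
    return (desired_ramp - ramp_rate) / Ki
```

The real value of MPC comes from doing this over a full future trajectory with move suppression and constraints, but the prediction-then-move structure is the same.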
MPC can be extended to fed-batch if models are switched as the batch progresses. The control of a temperature profile or product composition profile can be done by the translation of the controlled variable from temperature or concentration to rate of change of temperature and concentration, which is the slope of the batch profile. This creates a pseudo steady state and the ability to readily make changes in the controlled variable in both directions.
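A minimal sketch of the translation above, computing the slope of the recent profile window as the new controlled variable (a two-point slope for brevity; a least-squares fit over the window would be my choice in practice to reject noise):

```python
def profile_slope(samples, dt):
    """Rate of change of a batch profile over a window of samples.

    samples: recent temperature (or concentration) readings, oldest first
    dt:      sample interval
    Returns the slope, i.e. the pseudo steady state controlled variable.
    """
    # two-point slope across the whole window
    return (samples[-1] - samples[0]) / (dt * (len(samples) - 1))
```

Because the slope can be driven both up and down by the manipulated variable, the controller sees a two-directional pseudo steady state even though the underlying batch temperature or concentration only ramps one way.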
The abilities to handle nonlinearities and provide operator understanding are more challenging for MPC than for PID control. Consultants and proficient users can address these needs by switching in different models for different production rates and plant conditions and by adding graphics that enable the operator to see future trajectories and understand the relative contribution of each controlled variable, disturbance variable and constraint variable. For multivariable applications, proficiency in knowing and improving the matrix condition number and in using the optimizer (e.g., linear program) is useful.
My experience, personally and after doing 14 years of Control Talk columns with industry experts, is that the most successful and extensively used technologies are PID and MPC because they address Keys 1-10, with a track record for Keys 1 and 2 far exceeding the remaining technologies. Where PID and MPC fall short is in the ability to find unsuspected relationships and causes and effects, particularly when there are a large number of variables at play. The variables for PID and MPC are chosen and classified as controlled, manipulated, disturbance, constraint and optimization variables based on the knowledge of the person applying the technology. Also, regimented testing is used to identify the relationships. On the plus side, PID and MPC identify all of the necessary dynamics, whereas the other technologies leave a lot to the imagination. Thus, the PID and MPC solutions are narrowed dramatically compared to the remaining technologies to be discussed, which look at orders of magnitude more variables with very few preconceptions and little regimented testing, as proposed in the Industrial Internet of Things (IIoT).
Multivariate Statistical Process Control (Data Analytics)
Data analytics can find relationships between process inputs and process outputs in very large sets of data. Data analytics eliminates relationships between process inputs (cross correlations) and reduces the number of process inputs by the use of principal components, constructed to be orthogonal and thus independent of each other, in a plot of a process output versus principal components. For two principal components, this is readily seen as an X, Y and Z plot with each axis at a 90 degree angle to each other axis. The X and Y axes cover the range of values of the principal components and the Z axis is the process output. The user can drill down into each principal component to see the contribution of each process input. The use of graphics to show this can greatly increase operator understanding. Data analytics excels at identifying unsuspected relationships. For process conditions outside of the data range used in developing the empirical models, linear extrapolation helps prevent bizarre extraneous predictions. Also, there are no humps or bumps that cause a local reversal of process gain and buzzing.
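For two correlated inputs, the principal component construction can be sketched directly from the 2x2 covariance matrix: the eigenvalues show how much of the total variance each orthogonal component captures, which is what makes the dimension reduction possible. A hypothetical pure-Python illustration (function name and layout are my own):

```python
import math

def pca_2d(x1, x2):
    """Eigenvalues of the covariance matrix of two inputs.

    Returns (lam1, lam2), the variance captured by each orthogonal
    principal component, largest first. Perfectly cross-correlated
    inputs collapse onto a single component (lam2 == 0).
    """
    n = len(x1)
    m1, m2 = sum(x1) / n, sum(x2) / n
    a = sum((v - m1) ** 2 for v in x1) / n                       # var(x1)
    c = sum((v - m2) ** 2 for v in x2) / n                       # var(x2)
    b = sum((x1[i] - m1) * (x2[i] - m2) for i in range(n)) / n   # cov(x1, x2)
    # closed-form eigenvalues of the symmetric matrix [[a, b], [b, c]]
    root = math.sqrt(((a - c) / 2) ** 2 + b ** 2)
    return (a + c) / 2 + root, (a + c) / 2 - root
```

With real plant data the same computation is done on dozens or hundreds of inputs at once, keeping only the few components with significant eigenvalues.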
Batch data analytics does not need to identify the process dynamics because all of the process inputs are focused on a process output at a particular part of the batch cycle (e.g., endpoint). This is incredibly liberating. The piecewise linear fit to the batch profile enables batch data analytics to deal with the nonlinearity of the batch response. The results can be used to make mid batch corrections. Potentially batch data analytics can address most of the keys to success listed.
The application of data analytics to continuous processes requires the synchronization of the process inputs with the process outputs being predicted. This can be problematic because there are many delays and time constants of the instrumentation, control valves and unit operations that are unknown. A deadtime is identified and applied to each process input to deal with these dynamics. As you can imagine, this may work on a single unit operation with negligible automation system dynamics and no recycle, but it is a big challenge for most plant applications.
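A sketch of that synchronization step, assuming each input's deadtime has already been identified in sample counts (the function name and data layout are my own for illustration):

```python
def synchronize(inputs, deadtimes):
    """Align deadtime-shifted inputs with the output timeline.

    inputs:    dict of input name -> list of samples
    deadtimes: dict of input name -> deadtime in whole samples
    Returns one row per usable output time t, with each input taken at
    t - deadtime so its effect lines up with the output it influences.
    """
    max_d = max(deadtimes.values())
    n = min(len(series) for series in inputs.values())
    rows = []
    for t in range(max_d, n):   # earliest t where all shifted inputs exist
        rows.append({name: series[t - deadtimes[name]]
                     for name, series in inputs.items()})
    return rows
```

The hard part in a real plant is not this bookkeeping but getting the deadtimes right when they vary with production rate and when recycle blurs cause and effect.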
A word of caution for data analytics and the next technology discussed, Artificial Neural Networks (ANN): what are identified are correlations, not causes and effects. Review by people who understand the process (e.g., process and research engineers) and the possible use of first principle simulations are needed to validate causes and effects.
Both data analytics and ANN have been much more successful in predicting human responses, which tend to be much less deterministic and much more situational. There is possibly a greater future in predicting operator responses than process responses when it comes to continuous unit operations.
Artificial Neural Networks (ANN)
ANN offers predictive behavior that addresses nonlinearities much more effectively than data analytics, particularly for continuous processes. However, ANN does not necessarily use principal components to prevent cross correlations and to reduce the number of inputs. Consequently, an ANN may have order(s) of magnitude more process inputs than data analytics, without the ability to determine numerically the contribution of each input beyond a relative sensitivity analysis. Also, the response may have humps or bumps for conditions within the test range and strange results outside of the test data range.
Special techniques can be used to address these concerns. The future of ANN to me is more as a supplement rather than a replacement for other technologies. There has been considerable success in the use of a combination of an ANN and MPC. Nothing to me precludes the use of the Principal Component Analysis (PCA) extensively employed by data analytics to eliminate cross correlations and to dramatically reduce the number of process inputs to an ANN. The main problem is that the ANN and data analytics camps each see their technology as the best, taking an adversarial view rather than trying to get the best out of both.
Fuzzy Logic Control (FLC)
FLC has been successfully used where a process model cannot be obtained (e.g., mining industry). FLC automates the best of the relative operator responses. Given a model, it has been shown that a PID gives as good or better performance for a single loop if all of the PID options are used as needed. I expect the same is true in terms of MPC being a better solution for multivariable control if models can be identified. FLC is undesirable from the standpoint of tuning, understanding and technical support.
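To show mechanically what automating operator responses looks like, here is a toy fuzzy inference sketch: triangular memberships on the control error, three rules mapping the fuzzy error to a valve move, and a weighted-average defuzzification. All ranges and rules are hypothetical:

```python
def tri(x, a, b, c):
    """Triangular membership: 0 at a and c, 1 at the peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_move(error):
    """Weighted-average (defuzzified) valve move for a control error."""
    rules = [                                  # (rule strength, valve move)
        (tri(error, -10.0, -5.0, 0.0), -1.0),  # error negative -> close valve
        (tri(error,  -5.0,  0.0, 5.0),  0.0),  # error near zero -> hold
        (tri(error,   0.0,  5.0, 10.0), 1.0),  # error positive -> open valve
    ]
    w = sum(m for m, _ in rules)
    return sum(m * u for m, u in rules) / w if w else 0.0
```

With only three rules this is easy to follow; the maintainability problem shows up when dozens of overlapping rules interact and no one can predict the combined result for every condition, which is exactly the experience described below.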
I implemented a successful FLC for a neutralization system. The plant was afraid to touch the FLC because no one understood it. Decades later, I showed how an MPC could provide better optimization with a more continuously improvable and maintainable system as documented in the November 2007 Control article “Virtual Control of Real pH”.
Expert Systems
Seems like a great idea, and maybe there is a future for it, but the company I spent my career with had 10 people working on expert systems for ten years with only limited success. Often the expert systems complained too much, and sometimes about the wrong things. Operators saw them as a distraction. False alarms destroyed confidence in them. The expert systems fell into disuse and were turned off within a few years after the developer left the area. This was back in the 1990s. My hope is that the technology has advanced and addressed these issues. I strongly suggest one check how well they meet Keys 1-12.
The expert system we used also had fuzzy logic. I remember entering a lot of rules and then wondering how they played together and what the order of execution was. It was very easy, maybe too easy, to add rules but extremely difficult to analyze the consequences for all the possible conditions.
Expert systems do not have dynamics and are thus not used to predict a process variable but rather an abnormal condition and a possible steady state correction. I think there is a place for them in helping provide better diagnostics based on problems identified by other technologies.
As a closing note, the one New Year’s Resolution that I can guarantee I am going to keep is “I will not keep all my New Year’s Resolutions.” Happy New Year!