Economic value of better control

Part two of this three-part series reveals control technology concepts to reduce process variability and increase profitability
Feb. 3, 2026
5 min read

Key Highlights

  • Variability reduction can eliminate cavitation without hardware changes.
  • Not all variability should be controlled—some should be eliminated at the source.

When process liquid flows through devices with restrictions (thermowells, orifices, valves and pumps), the fluid accelerates. When exiting the device, its velocity returns to normal. The Bernoulli effect means the locally higher velocity lowers the fluid pressure.

Cavitation occurs when liquid pressure falls below the vapor pressure or degassing pressure, temporarily causing the fluid to flash boil or degas. When the fluid exits the device, velocity falls and pressure recovers, causing the bubbles to collapse. Liquid on either side of a collapsing bubble propagates shock waves that can damage equipment. Higher production rates mean higher fluid velocity, which can lead to cavitation. Though it's not normally an issue, cavitation can limit throughput. Operating with cavitation is possible, but it shortens equipment life.
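As a rough illustration of the mechanism, the following Python sketch applies the Bernoulli relation with continuity to estimate the pressure at a restriction's throat and compare it to the vapor pressure. The upstream pressure, area ratio and fluid properties are assumed values for illustration, not design numbers.

    # Minimal sketch: screen a restriction for cavitation risk with the
    # Bernoulli relation. Properties and geometry here are illustrative.

    RHO = 998.0        # liquid density, kg/m^3 (water near 20 degC)
    P_VAPOR = 2.34e3   # vapor pressure, Pa (water near 20 degC)

    def throat_pressure(p_up, v_up, area_ratio):
        """Throat pressure from Bernoulli plus continuity.

        p_up: upstream pressure, Pa
        v_up: upstream velocity, m/s
        area_ratio: upstream area / throat area (>1 for a restriction)
        """
        v_throat = v_up * area_ratio   # continuity: A*v is constant
        return p_up + 0.5 * RHO * (v_up**2 - v_throat**2)

    # Higher throughput raises velocity; a restriction that was safe at
    # low rates can cavitate at high rates.
    for v in (1.0, 2.0, 4.0):
        p_t = throat_pressure(2.0e5, v, 6.0)
        print(f"v = {v:.1f} m/s -> throat pressure {p_t/1e3:7.1f} kPa, "
              f"cavitation risk: {p_t < P_VAPOR}")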

Cavitation can be eliminated by operating at higher pressure, which can be a control decision, such as increasing the level setpoint in a tank, or a design solution, such as placing the device in an expander-contractor assembly. Cavitation also can be eliminated by operating at lower temperature, or with improved control that reduces variation in throughput, pressure or temperature. Whatever the solution, the increased production rate or extended equipment life can economically justify the process or setpoint change.

Sensor reliability and location

Control systems rely on measurements. If the sensor tends to fail or lose calibration, has noise or poor resolution, or is located where it causes a measurement delay (deadtime), then control will be degraded. Alternate sensors, alternate locations or inferential measurements can solve those problems.

Blending

Increased blending of material in a process reduces the impact of temperature or composition variation that comes from disturbances or feedstock variation. Blending can be achieved by preparing feedstocks, inline mixing, longer or larger process lines, increased tank volume or more effective tank agitation. An idealized analysis relates variability to the quantity of material being mixed: the standard deviation of the mixed product composition scales inversely with that quantity. If the blended volume or mass is doubled, the standard deviation of the process variable (PV) variation is halved.
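A minimal sketch of that dilution effect, assuming a well-mixed tank and a brief feed-composition upset; the flow, volumes and upset magnitude are hypothetical, and the point is the scaling, not the specific numbers.

    # Peak outlet deviation from a short feed-composition pulse through
    # a well-mixed tank, for several tank volumes (Euler integration).

    F = 10.0       # flow through the tank, m^3/h
    DT = 0.001     # integration step, h
    T_END = 2.0    # simulated horizon, h

    def peak_deviation(volume, pulse_mag=1.0, pulse_len=0.01):
        c, peak, t = 0.0, 0.0, 0.0
        while t < T_END:
            c_in = pulse_mag if t < pulse_len else 0.0  # brief upset
            c += DT * (F / volume) * (c_in - c)         # mixing balance
            peak = max(peak, c)
            t += DT
        return peak

    for v in (1.0, 2.0, 4.0):
        print(f"V = {v:.0f} m^3 -> peak deviation {peak_deviation(v):.4f}")
    # Doubling the mixed volume roughly halves the deviation the
    # downstream process sees, consistent with the inverse-quantity rule.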

Eliminate assignable causes

An assignable cause is an occasional external event that upsets the process. It may be a sudden rainstorm that rapidly cools equipment, an occasional raw material batch that has an impurity, a measurement sensor failure, an electrical circuit trip or many other singular events. It’s not a continual influence on the process. 

Statistical control charts can reveal when such an event creates an unusual deviation from normal process variation. Structured procedures, such as Six Sigma or statistical process control (SPC), can organize the search for the culprit event. Once identified, process or management procedures can be changed to either eliminate such events or eliminate their impact on the process. The term “assignable cause” means the source of the upset might not be known initially, but it has a statistically real impact on the process, and it can be identified.
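As a sketch of the chart logic, the following snippet builds a Shewhart-style individuals chart from synthetic data and flags points beyond the 3-sigma limits. (Practical SPC typically estimates sigma from moving ranges; the sample standard deviation is used here for brevity.)

    # Flag an assignable-cause event on a Shewhart-style chart.
    import random

    random.seed(1)
    pv = [50.0 + random.gauss(0.0, 0.5) for _ in range(100)]
    pv[60] += 3.0   # a singular upset, e.g., a sudden rainstorm

    # Center line and control limits estimated from the data
    mean = sum(pv) / len(pv)
    sigma = (sum((x - mean) ** 2 for x in pv) / (len(pv) - 1)) ** 0.5
    ucl, lcl = mean + 3 * sigma, mean - 3 * sigma

    for i, x in enumerate(pv):
        if x > ucl or x < lcl:
            print(f"sample {i}: PV = {x:.2f} outside [{lcl:.2f}, {ucl:.2f}]")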

Each time the impact of an assignable cause is eliminated, process variance is improved, along with associated benefits.

Calculation challenges

The concepts illustrated in Figures 1 and 2 of Part 1 and in traditional statistical control charts assume the classical Gaussian variation (normally distributed variation) in a process variable. However, nonlinearity in a process, interactions or persistence of disturbances may make PV variation non-Gaussian. In this case, classic metrics of variation (variance and standard deviation), classic statistical procedures (T- or F-test, ANOVA), or classic linear regression may be invalid. 
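Before relying on those classic metrics, it's worth checking the distribution. A quick sketch, using SciPy's D'Agostino-Pearson normality test on synthetic Gaussian and skewed PV histories:

    # Check whether a PV history is plausibly Gaussian before trusting
    # variance-based metrics or classic tests.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    gaussian_pv = rng.normal(50.0, 0.5, size=500)
    # Persistent one-sided disturbances often produce skewed data
    skewed_pv = 50.0 + rng.lognormal(mean=0.0, sigma=0.6, size=500)

    for name, data in (("gaussian", gaussian_pv), ("skewed", skewed_pv)):
        stat, p = stats.normaltest(data)
        verdict = ("no evidence against normality" if p > 0.05
                   else "non-Gaussian; classic metrics suspect")
        print(f"{name}: p = {p:.3g} -> {verdict}")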


If you can determine the pattern of deviations from setpoint, and the narrowing of that distribution due to improved control, then you can determine the economic benefits of control improvements, which permit operating closer to constraints.

A classic heuristic rule is that each advance in control strategy halves the PV variation, but actual results vary. So, the question is, “How do we assess the impact of control improvements on variability and, from that, the impact on process economics?” One answer is to use process simulation.

Process simulation

Process dynamic simulators, which are grounded in accurate models and validated with process data, are a systematic way to calculate the benefits of improved control. In modern parlance, a dynamic simulator that serves as a surrogate for the process is called a digital twin. Adding environmental effects to the simulator (noise, drift, stiction, resolution, etc.) makes the simulation representative of what nature will give you.

A simulation including natural vagaries is a stochastic simulation. By contrast, most simulations are deterministic. When your process is operating, nature doesn’t keep the inlet humidity, fuel BTU content, ambient losses or catalyst reactivity constant. Nature contrives mechanisms that add noise to measurements. 
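A minimal sketch of the idea, assuming a first-order process with hypothetical noise and drift magnitudes that would be calibrated against plant data:

    # Turn a deterministic first-order model into a stochastic one by
    # adding measurement noise and a slow random-walk load drift.
    import random

    random.seed(2)
    DT, TAU, GAIN = 1.0, 20.0, 2.0   # s, s, (engineering units)/(% output)

    def simulate(n_steps, noisy=True):
        pv_true, drift, history = 0.0, 0.0, []
        for _ in range(n_steps):
            u = 10.0                              # constant controller output, %
            if noisy:
                drift += random.gauss(0.0, 0.01)  # unmeasured-load random walk
            pv_true += DT / TAU * (GAIN * (u + drift) - pv_true)
            noise = random.gauss(0.0, 0.05) if noisy else 0.0
            history.append(pv_true + noise)       # what the sensor reports
        return history

    det = simulate(5000, noisy=False)
    sto = simulate(5000, noisy=True)
    spread = max(sto[1000:]) - min(sto[1000:])    # after settling
    print(f"deterministic PV settles at {det[-1]:.2f}; "
          f"stochastic PV spread {spread:.2f}")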

Simple models for generating noise and disturbances can be found in the Control series “Adding realism to dynamic simulation for control testing,” part one and part two, and the article, “Nonlinear model-based control: using first-principles models in process control,” published by the International Society of Automation (ISA).

Calibrate your digital twin and its input disturbances, so the simulator matches the variation you currently have in your process. 

From an extended-time simulation, measure the frequency and magnitude of specification violations, waste generation, on-constraint events, and consumption of material and utilities. Change the simulator to represent the control improvements you’re considering, and run it for an extended time to reveal the new controlled variable (CV) and PV distributions. This will allow you to explore setpoint changes that reduce operating expenses and/or improve throughput. Run it with the new setpoints to assess the variations and improvements in quality, throughput, etc.
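A minimal sketch of that before-and-after bookkeeping, using synthetic Gaussian PV histories as stand-ins for digital-twin output and an assumed upper specification limit:

    # Count specification violations for baseline vs. improved control,
    # then test a setpoint moved toward the constraint.
    import random

    random.seed(3)
    SPEC_MAX = 52.0   # assumed upper quality limit

    def pv_history(setpoint, pv_std, n=20000):
        return [setpoint + random.gauss(0.0, pv_std) for _ in range(n)]

    def violation_rate(history):
        return sum(x > SPEC_MAX for x in history) / len(history)

    base = pv_history(setpoint=50.0, pv_std=0.7)     # current control
    tight = pv_history(setpoint=50.0, pv_std=0.35)   # proposed improvement
    print(f"violations: baseline {violation_rate(base):.2%}, "
          f"improved {violation_rate(tight):.2%}")

    # With less variation, the setpoint can move closer to the constraint
    # (more throughput or less giveaway) at the same violation rate.
    moved = pv_history(setpoint=51.0, pv_std=0.35)
    print(f"violations at moved setpoint: {violation_rate(moved):.2%}")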

Use the results of the simulated current operation, which match current process values, to establish the simulator’s credibility with your customers. Then use the before-and-after results to credibly reveal the economic impact of your proposed control improvement.

About the Author

R. Russell Rhinehart

Columnist

Russ Rhinehart started his career in the process industry. After 13 years and rising to engineering supervision, he transitioned to a 31-year academic career. Now “retired,” he returns to coaching professionals through books, articles, short courses, and postings to his website at www.r3eda.com.
