Methods to assess whether a change warrants correction via controls

Aug. 30, 2017
Often, a human wants to fix something, but without understanding cause-and-effect mechanisms or statistical vagaries in a signal, the human tweaking to improve just increases variation. “Tampering” is the term W. Edwards Deming applied to what I’ll describe as “messing with it, and making it worse.” Deming’s funnel experiment is easily visualized: Hold a funnel a few inches above a target mark on a table. Now, one-by-one, toss marbles into the funnel. They exit the bottom with random angle and spin, fall on the table, and roll to rest. The marbles surround the target in a circular pattern on the table. The size of the pattern is a response to the vagaries of their path through the funnel. The process has natural variation.

Now, let’s add control. Toss a marble. Let it come to rest. If it lands three inches to the southeast of the funnel, then the response might be to move the funnel three inches to the northwest. If the funnel remains three inches northwest, the circular pattern will be the same, but just shifted.

However, if with each marble toss, the funnel is moved to compensate for its deviation from target, the control action of moving the funnel will enlarge the circle of resting marbles. The lesson is: when variation is independent, control response to natural variation will increase the variation.
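The funnel experiment is easy to reproduce numerically. The following is a minimal sketch (names and the noise level are illustrative, not from Deming): marbles land with Gaussian scatter around the funnel position, and the tampering rule moves the funnel to cancel each observed deviation. Theory says full compensation of independent variation doubles the variance, so the spread grows by a factor of about the square root of two.

```python
import random

random.seed(1)
N = 10_000
sigma = 1.0  # natural (common-cause) scatter of each marble's landing point

# Rule 1: leave the funnel alone.
hits_fixed = [random.gauss(0.0, sigma) for _ in range(N)]

# Tampering rule: after each marble, move the funnel to cancel
# the last observed deviation from target.
funnel = 0.0
hits_tampered = []
for _ in range(N):
    hit = funnel + random.gauss(0.0, sigma)
    hits_tampered.append(hit)
    funnel -= hit  # "correct" by the full observed error

def stdev(xs):
    m = sum(xs) / len(xs)
    return (sum((x - m) ** 2 for x in xs) / (len(xs) - 1)) ** 0.5

print(stdev(hits_fixed))     # close to 1.0: natural variation alone
print(stdev(hits_tampered))  # close to sqrt(2) ~ 1.41: variance has doubled
```

Because each landing is independent of the last, the funnel position after a correction equals minus the previous random error, so each tampered hit is the difference of two independent errors, and the variances add.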

Statistical process control (SPC) tempers changes to prevent tampering, permitting control action only when there is statistical evidence of a real change, an assignable cause. SPC has evolved into Six Sigma, and its concepts form the basis of quality methods.

The tempering concept can be applied to automatic control. Consider orifice flow rate measurement and control. If the valve doesn't change position and the flow rate is unchanged, the orifice reading will still be affected by random vagaries in the fluid turbulence, and there will be variation in the measurement, perhaps 2% of full scale. If the valve were opened a bit more, the flow rate would be a bit higher; the average would increase, but the variation would be the same. If, instead, the valve is opened and closed in response to statistical vagaries in the measurement, then the true flow rate rises and falls, perhaps by 1% of full scale. Combined with the sensor variation, the measured flow rate could then vary by up to 3% of full scale, a 50% increase.
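A simulation illustrates the mechanism, though not the article's exact percentages. In this sketch (the gain, noise level, and variable names are assumptions for illustration), the true flow is perfectly steady, but an integral-only controller chases the noisy measurement, so its valve moves inject real variation on top of the sensor noise:

```python
import random

random.seed(2)
N = 20_000
sigma = 2.0       # measurement noise, in % of full scale
setpoint = 50.0
gain = 0.5        # integral action applied to each observed error

# Open loop: valve fixed, true flow steady; only sensor noise appears.
measured_open = [50.0 + random.gauss(0.0, sigma) for _ in range(N)]

# Closed loop: the controller reacts to every noisy reading.
measured_closed = []
u = 0.0           # valve adjustment away from the steady position
for _ in range(N):
    true_flow = 50.0 + u              # valve moves shift the true flow
    m = true_flow + random.gauss(0.0, sigma)
    measured_closed.append(m)
    u += gain * (setpoint - m)        # controller chases the noisy reading

def stdev(xs):
    mean = sum(xs) / len(xs)
    return (sum((x - mean) ** 2 for x in xs) / (len(xs) - 1)) ** 0.5

print(stdev(measured_open))    # about 2.0: sensor noise alone
print(stdev(measured_closed))  # noticeably larger: control added true variation
```

With these assumed settings the closed-loop measurement spread grows by roughly 15%; the exact amount depends on the controller gain, but it is always an increase when the controller responds to independent noise.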

If you adjust a batch recipe after each batch, seeking to make the key performance metric meet a target, then likely, you're adding variation. Even if the measurement is noiseless, the outcome of each batch will be subject to the vagaries of material and processing that led to the outcome. If the vagaries are independent, correction that would have been good for the last batch will just compound error in the next independent batch.

If you're adjusting model coefficient values from measured data, then even if the process is not changing, variation in the data will make it appear that the model coefficient values are changing. These coefficients could be related to heat exchanger fouling, catalyst reactivity, pseudo-component composition, diffusivity, etc. Changing the model in response to signal vagaries will change resulting process setpoints or control signals, shift the process, and increase variation.
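The apparent drift in coefficients is easy to see in a simulation. In this sketch (the model, coefficient value, and names such as `true_a` and `fit_slope` are illustrative assumptions), the true process coefficient never changes, yet each fresh batch of noisy data yields a different least-squares estimate:

```python
import random

random.seed(3)
true_a = 2.0      # the process coefficient never changes
noise = 0.3       # measurement noise on each observation

def fit_slope(n=20):
    # Least-squares slope through the origin from n noisy observations.
    xs = [random.uniform(1.0, 5.0) for _ in range(n)]
    ys = [true_a * x + random.gauss(0.0, noise) for x in xs]
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

# Refit the "same" coefficient from ten independent data sets.
estimates = [fit_slope() for _ in range(10)]
print([round(a, 3) for a in estimates])
# The estimates scatter around 2.0. Acting on each new value would
# shift setpoints even though nothing in the process actually changed.
```

Updating the model to each new estimate is the funnel experiment again, with the coefficient playing the role of the funnel position.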

Algorithmic SPC is the application of SPC to automated procedures. Have an SPC module watch the output of a controller, or batch recipe fixer, or model coefficient calculator. When the output isn't indicating a statistically significant change, don’t pass through the change.

There are many ways to do this, but to be compatible with online devices, we need a computationally simple method. I like using a cumulative sum (CUSUM) method, which I call an SPC filter, and I freely offer a mathematical explanation and demonstration software (open code) on my website. Mouse over the Techniques menu, then click on Statistical Filters.
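As a generic illustration of the idea, not Rhinehart's published algorithm, a two-sided CUSUM gate might look like the sketch below: it holds the last accepted value and passes a new one through only when the accumulated evidence of a shift crosses a threshold. The class name, the slack parameter `k`, and the threshold `h` are conventional CUSUM choices, assumed here for illustration.

```python
import random

class CusumFilter:
    """Pass a new value through only when a two-sided CUSUM signals a
    statistically significant shift; otherwise hold the last accepted value."""

    def __init__(self, sigma, k=0.5, h=5.0):
        self.k = k * sigma          # slack ("allowance"), often 0.5 sigma
        self.h = h * sigma          # decision threshold, often 4-5 sigma
        self.hi = 0.0               # accumulated evidence of an upward shift
        self.lo = 0.0               # accumulated evidence of a downward shift
        self.held = None            # last value passed through

    def update(self, x):
        if self.held is None:
            self.held = x
            return self.held
        dev = x - self.held
        self.hi = max(0.0, self.hi + dev - self.k)
        self.lo = max(0.0, self.lo - dev - self.k)
        if self.hi > self.h or self.lo > self.h:
            self.held = x           # evidence of a real change: accept it
            self.hi = self.lo = 0.0
        return self.held

# Demonstration: noisy steady signal, then a genuine step change.
random.seed(4)
f = CusumFilter(sigma=1.0)
out1 = [f.update(10 + random.gauss(0, 1)) for _ in range(50)]  # steady at 10
out2 = [f.update(15 + random.gauss(0, 1)) for _ in range(50)]  # step to 15
print(out1[0], out2[-1])  # output mostly holds, then tracks the real step
```

While the signal only wanders with noise, the filter output barely moves; when the true level steps by several sigma, the CUSUM crosses its threshold within a few samples and the change is passed through.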

Process lags often provide natural tempering that masks the variability-increasing effect of control. The SPC-filter approach is therefore most applicable to filtering noisy signals, batch-to-batch recipe or model adjustment, setpoint adjustment, and feedback control on processes with a large delay, such as that due to chemical analysis.

About the Author

R. Russell Rhinehart | Columnist

Russ Rhinehart started his career in the process industry. After 13 years and rising to engineering supervision, he transitioned to a 31-year academic career. Now “retired,” he returns to coaching professionals through books, articles, short courses, and postings to his website.
