Cascade, Scan Time, PID Tuning

When Should You "Slow Down" a PID Loop by Increasing the Loop Update Time So That the Loop Executes Less Often?

By Bela Liptak


Some signal delays can be anticipated by placing a sensor closer to the source of the disturbance and computing the disturbance's arrival at the control loop from a known or measured disturbance velocity. The signal derived this way enters the control scheme through a feed-forward path and permits performance that is less hampered by dead time in at least one part of the scheme.
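A minimal sketch of that idea in Python, assuming a hypothetical upstream sensor, a measured transport velocity, and a simple gain-only feed-forward law (all names and values here are illustrative, not from the column):

```python
from collections import deque

# Sketch: delay the upstream disturbance measurement by the transport
# time (distance / velocity), then scale it into a feed-forward
# correction that is added to the PID output. All parameters are
# illustrative assumptions.
class TransportDelayFeedforward:
    def __init__(self, distance_m, velocity_m_per_s, scan_time_s, ff_gain):
        transport_delay_s = distance_m / velocity_m_per_s
        n = max(1, round(transport_delay_s / scan_time_s))
        self.buffer = deque([0.0] * n)   # FIFO holding one transport delay of samples
        self.ff_gain = ff_gain

    def update(self, upstream_measurement):
        self.buffer.append(upstream_measurement)
        delayed = self.buffer.popleft()  # value from one transport delay ago
        return self.ff_gain * delayed    # add this to the PID output

ff = TransportDelayFeedforward(distance_m=12.0, velocity_m_per_s=1.5,
                               scan_time_s=1.0, ff_gain=-0.8)
```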

There is no way to compensate for sampling time delay within the control loop, other than to reduce the sampling time by increasing the sampling frequency.

In process control loops, the sampling time of the controller should never be an issue. It should be less than 1% of the characteristic time of the control loop. When it is more, the wrong controller is in use and should be replaced by a more suitable device. The last thing you want is a controller that makes things worse. Instruments earn their keep by making things better.

Otto Muller-Girard, PE
Omg@frontiernet.net

A: It is a good idea to slow down the scan time for slow-moving controllers. When the reset is more than five minutes, in many cases the controller does not need to run every second or even every 10 seconds. A common rule of thumb is that the scan time for a controller should be at least 10 times faster than the reset in minutes/repeat or the rate in minutes. In my experience, the scan time almost never has to be more than 30 times faster than the reset.

You have a choice of adjusting the scan time or the PV filter time. When the scan time falls between one-thirtieth and one-tenth of the reset, the PV filter time can be kept in a more typical range.

For example, if the reset is 10 minutes/repeat, then 30 times faster translates to a 20-second scan time as the minimum, and 10 times faster equates to a 1-minute scan time as the maximum.
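The arithmetic behind that example, as a small Python sketch (the function name is illustrative, not from the column):

```python
# Rule of thumb above as arithmetic: scan somewhere between
# one-thirtieth and one-tenth of the reset time.
def scan_time_bounds_s(reset_min_per_repeat):
    reset_s = reset_min_per_repeat * 60.0
    return reset_s / 30.0, reset_s / 10.0  # (fastest useful, slowest acceptable)

print(scan_time_bounds_s(10.0))  # -> (20.0, 60.0): 20 s minimum, 1 min maximum
```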

Jim Becker
james.becker@bayer.com

A: The scan should be fast enough to capture the pertinent information.
If it is too long, dead time will be increased and loop performance will degrade; you will have to detune the loop because the apparent dead time is longer. If it is too short, nothing is gained, and it demands too much time from the CPU.

As a rule of thumb, for control loops tuned to reject disturbances, the scan time should be around one-tenth of the dead time; a loop with 30 seconds of dead time, for example, calls for a scan time of about 3 seconds. That being said, if the loop is tuned sluggishly, there is no need to use a fast scan time.

Michel Ruel, P.Eng.
Michel.Ruel@bba.ca

A: Increasing the scan rate may improve stability, and it won't hurt; it all depends on the time constants. Detuning will always reduce the quality of control, yet if dead time dominates the control loop, you must detune to maintain stability.

Be careful to tune the process, not the valve.

The old rule of thumb said that when you used sampled data, you had to sample five or ten times faster than the loop time constant to "keep in touch" with the situation. Lengthening the sample interval can be a bad thing, but there is no magic number. The fear is what is called "aliasing": when the sampling interval is near an integral fraction of the period of a cycle in the process, the samples catch the peaks or valleys of the cycles and fail to show the real response for a while, then drift between the peaks and valleys and present a very confused image of the process dynamics. The cure is to sample frequently enough.
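A small Python sketch of the aliasing effect, assuming for illustration a 1 Hz oscillation in the process variable sampled at 0.9 samples per second:

```python
import math

# A 1 Hz cycle in the process variable, sampled at 0.9 samples/s
# (well below the 2 Hz Nyquist rate), appears as a slow ~0.1 Hz drift.
true_freq_hz = 1.0
sample_period_s = 1.0 / 0.9

for k in range(12):
    t = k * sample_period_s
    pv = math.sin(2.0 * math.pi * true_freq_hz * t)
    print(f"t = {t:5.2f} s   sampled PV = {pv:+.3f}")
# Over these 12 samples (~13 s) the true PV completes ~13 full cycles,
# but the printed values trace out barely more than one slow cycle --
# the "confused image" described above.
```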

Back to master/slave loops. Forgive me if this is elementary for you, but I want to go back to the basics for fear we could be thinking of different things.

I once worked in a plant where the internal temperature of jacketed kettles had to be closely controlled. Any change in the source of heating or cooling (steam and cold water) would affect the kettle temperature, and the system would never settle down. The slave loop has the duty of quickly stabilizing the source of heat or cooling and insulating the slow master controller from the faster upsets. It is usually far faster than the master loop (tens of seconds versus tens of minutes) and should be tuned for fast response. The output signal of the master (vessel temperature) controller is used as the set point of the secondary (jacket inlet temperature) loop, as sketched below. The sampling rate for the master controller has to be frequent enough to maintain good control.
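A minimal Python sketch of that cascade arrangement, with PI controllers only and illustrative tunings (in a real system the slow master would execute once for every several slave scans):

```python
# Master (kettle temperature) PI output becomes the set point of the
# fast slave (jacket inlet temperature) PI, which drives the valve.
# All gains and scan times are illustrative assumptions.
class PI:
    def __init__(self, kp, ti_s, scan_s):
        self.kp, self.ti_s, self.scan_s = kp, ti_s, scan_s
        self.integral = 0.0

    def update(self, sp, pv):
        error = sp - pv
        self.integral += error * self.scan_s / self.ti_s
        return self.kp * (error + self.integral)

master = PI(kp=2.0, ti_s=600.0, scan_s=10.0)  # minutes-scale vessel loop
slave  = PI(kp=0.5, ti_s=15.0,  scan_s=1.0)   # seconds-scale jacket loop

def cascade_step(kettle_sp, kettle_temp, jacket_temp):
    jacket_sp = master.update(kettle_sp, kettle_temp)  # master output -> slave SP
    return slave.update(jacket_sp, jacket_temp)        # slave output -> valve
```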

"Detuning" in my world means reducing gain and lengthening reset time (integral) to decrease response. The penalty is obviously sluggish response and poor control. But the charts or display might look nice.

Some time ago, while developing the ISA Standard 75.25 on control valve response, I worked up a computer simulation of a simple level control loop using a less-than-perfect control valve, while subjecting the loop to a forced upset. For each valve I made a number of runs, starting with the controller set at a low gain and stepping up to a high gain, and computed the integrated error for each run. This was repeated for another, less perfect valve, and so on.

The 3-D plot of process error against valve response and controller gain was very interesting. The difference in total error for the perfect valve versus the less-than-perfect valve was huge. For each valve, the minimum of the error curve showed where the controller gain was optimum for the loop with that valve, and these error minimums varied a great deal. The increased error resulted from the requirement to reduce controller gain to stop cycling. A few percent difference in valve precision had a much greater impact on the quality of control than you might expect. This required a patient computer.
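A rough Python reconstruction of that experiment under stated assumptions: a simple integrating level process, a valve whose imperfection is modeled as deadband, a step load upset, and a sweep of controller gains with the integrated absolute error (IAE) recorded for each run. None of the numbers come from the ISA-75.25 work; they are placeholders.

```python
# Sweep controller gain for a "perfect" valve (no deadband) and a
# sticky valve (2% deadband) on a simple level loop, recording IAE.
def run_loop(kc, deadband, scan_s=1.0, duration_s=900.0):
    sp, level, valve_pos = 50.0, 50.0, 50.0
    ti_s, integral, iae = 120.0, 0.0, 0.0
    for k in range(int(duration_s / scan_s)):
        load = 5.0 if k * scan_s > 60.0 else 0.0   # step upset at t = 60 s
        error = sp - level
        integral += error * scan_s / ti_s
        demand = 50.0 + kc * (error + integral)     # PI controller output
        if abs(demand - valve_pos) > deadband:      # stem moves only past deadband
            valve_pos = demand
        level += 0.02 * (valve_pos - 50.0 - load) * scan_s  # integrating process
        iae += abs(error) * scan_s
    return iae

for deadband in (0.0, 2.0):
    results = [(run_loop(kc, deadband), kc) for kc in (0.25, 0.5, 1.0, 2.0, 4.0)]
    best_iae, best_kc = min(results)
    print(f"deadband = {deadband}%: optimum Kc = {best_kc}, IAE = {best_iae:.0f}")
```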

