This article was printed in CONTROL's June 2009 edition.
Greg McMillan and Stan Weiner bring their wits and more than 66 years of process control experience to bear on your questions, comments, and problems.
Write to them at [email protected].
Stan: In the old days, the deadband of mechanical components and the transmission lag of pneumatic devices added time delays to loops that would be unacceptable by today's standards. Fortunately, we didn't have auto tuners and data historians back then, so we didn't realize the pathetic response and repeatability of these measurements. The Bourdon tubes, bellows and linkages of these measurements belong in a really scary exhibit along with pictures of us with big mustaches and bellbottoms.
Greg: Now we have smart microprocessor-based measurements, such as Coriolis mass flowmeters, coplanar pressure and differential pressure transmitters, and radar level transmitters, whose installed accuracy is 50 times better than what we worked with in the 1960s and 1970s. Their accuracy approaches the uncertainty in calibration standards. The drift is so small the average instrument engineer will have changed jobs or retired before the device needs a calibration check. The response is so fast the measurement time constant is too small to be measured in a DCS. The decision to use these instruments is a no-brainer if one considers the reduction in lifecycle costs, the elimination of wild goose chases, and the possible process control improvement and optimization opportunities.
Stan: The way I ensured that a project's budget wouldn't prevent me from doing what was best was to spend exactly what was estimated, which let me use the state of the art, particularly in flow measurements. The main sources of uncertainty, maintenance headaches and poor response are sensing lines (impulse lines) and sample lines, so I was on a mission to go with in-line measurements.
Greg: I recently reread a 1992 InTech article, "Gas-Purged DP Transmitters for Liquid Level and Flow," that gave technical details on how to keep sensing lines at a known composition and state. Purges are designed to keep sensing lines from plugging, trapping process fluids, or collecting vapors or condensation. Because liquid purges can adversely affect the process composition, the use of gas purges was studied. It was found that a sudden increase in process pressure can cause the process to backfill the sensing line or dip tube with process fluid. The result is a compression of the gas in the line or tube and a bump in the measurement indication. The pressure, flow or level indication does not return to normal until the gas purge flushes out the process fluid. The obvious solution would seem to be to just crank up the purge rate, but a high purge rate creates an appreciable pressure drop in the sensing lines and increases the cost of the purge gas, which in many cases is nitrogen, as well as the cost of the vent system on large projects. The size and duration of the bump in the measurement indication make the 200-msec response time and 0.02% repeatability of the differential pressure transmitter a wasted opportunity.
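The tradeoff Greg describes can be put in rough numbers. The following is a minimal sketch, with all values hypothetical and an idealized plug-flow assumption (gas compression dynamics ignored), of how long the measurement bump lasts and how the sensing-line pressure drop scales with purge rate:

```python
def purge_recovery_min(backfill_ml, purge_sccm):
    """Minutes for the purge to displace backfilled process fluid
    from the sensing line (idealized plug-flow displacement)."""
    return backfill_ml / purge_sccm

def line_drop_ratio(rate_ratio):
    """Approximate multiplier on sensing-line pressure drop when the
    purge rate is scaled by rate_ratio (turbulent flow: drop ~ rate^2)."""
    return rate_ratio ** 2

# Hypothetical case: a pressure surge backfills 25 mL of process fluid
# into a line purged at 50 sccm.
bump_duration = purge_recovery_min(25.0, 50.0)   # ~0.5 min of bad readings
penalty = line_drop_ratio(2.0)                   # doubling the purge rate
                                                 # roughly quadruples the drop
```

Halving the bump duration by doubling the purge rate thus costs roughly four times the line pressure drop, plus twice the nitrogen, which is the dilemma the article describes.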
Stan: We've managed to take 21st-century technology and degrade its performance to 1970s and 1980s levels with old installation practices.
Greg: RTD sensors have a repeatability of 0.1 °C and a drift of less than 0.04 °C per year, yet you see installations with insufficient immersion lengths that are measuring jacket or ambient temperature. The most flagrant example is the temperature measurement of structured packing in columns. Nearly all distillation columns, strippers and absorbers built in the last 10 years use structured packing instead of trays. In the good old days of trays, we had plenty of room for an immersion length at least 10 times the thermowell diameter, a rule of thumb to ensure that the conduction error from heat loss along the thermowell wall from the tip to the flange was negligible. The new packing is as thin as foil, and an attempt to drill a hole for a thermowell creates a jumbled mess. Bare RTD sensing elements are used to reduce the thermal conduction error, but since their tips don't extend past the interior wall, you're most likely measuring nozzle temperature rather than process temperature. I'd expect a lot of the separation efficiency gained by going to packing is lost from monitoring and controlling unrepresentative temperatures.
Stan: The primary time constant of a bare RTD element varies from about 4 to 10 seconds, depending on manufacturer and model number. The time constant of an RTD in a thermowell varies from 25 to 100 seconds, depending on fit and fluid velocity. For a tight fit (0.01-inch clearance between the element and the inside diameter of the well), the limitation to response speed is the process's convective heat transfer coefficient, which depends on the fluid velocity. Thermocouples are said to be faster, but once installed in a thermowell, the difference is masked.
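Stan's time constants describe a first-order lag. As a rough sketch (the time constants below are picked from the ranges he quotes, not measured values), here's how much of a step change in process temperature each sensor would report after a given time:

```python
import math

def first_order_step(tau, t):
    """Fraction of a step change seen by a first-order sensor with
    time constant tau after t seconds: 1 - exp(-t/tau)."""
    return 1.0 - math.exp(-t / tau)

# Hypothetical time constants from the ranges quoted above
tau_bare = 5.0    # bare RTD element, seconds
tau_well = 50.0   # RTD in a loose-fitting thermowell, seconds

for t in (5, 25, 50, 150):
    print(f"t={t:4d}s  bare: {first_order_step(tau_bare, t):6.1%}  "
          f"thermowell: {first_order_step(tau_well, t):6.1%}")
```

After 50 seconds the bare element has essentially settled, while the thermowell-mounted RTD has seen only about 63% of the change, which is why the thermocouple-versus-RTD speed debate is moot once a well is involved.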
Greg: In an analogous fashion, the damping setting on a transmitter or the signal filter in a DCS should be set just large enough to keep the controller output fluctuations within the resolution limit of the control valve, so it doesn't react to noise.
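Greg's sizing rule can be turned into arithmetic. Assuming the noise is roughly sinusoidal, the filter is first-order, and only proportional action matters (all simplifications, and all numbers below hypothetical), this sketch solves for the smallest filter time constant that keeps the controller output fluctuation within the valve resolution:

```python
import math

def filter_time_for_noise(noise_amp, noise_period, pid_gain, valve_resolution):
    """
    Smallest first-order filter time constant (seconds) such that
    pid_gain * attenuated noise amplitude <= valve_resolution.
    A first-order filter attenuates a sinusoid of period T by
    1 / sqrt(1 + (2*pi*tau/T)**2).
    """
    target_ratio = valve_resolution / (pid_gain * noise_amp)
    if target_ratio >= 1.0:
        return 0.0  # noise already within the valve resolution; no filter needed
    return (noise_period / (2 * math.pi)) * math.sqrt(1.0 / target_ratio**2 - 1.0)

# Hypothetical loop: 0.5% noise with a 2-s period, controller gain of 2,
# valve resolution of 0.25%
tau = filter_time_for_noise(0.5, 2.0, 2.0, 0.25)
```

For these numbers the answer is about 1.2 seconds; any larger filter just adds lag and hides the real process response.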
I just need to make sure I don't lose the signal. If I'd lost most of my hearing at Grateful Dead concerts, I'd have an excuse. I feel I'm becoming more like my older friends, who are no longer listening. Stan, if I ever get to this point, just yell in my ear, "Be here now!" especially if you're talking about your pool in Naples.
Stan: One thing I don't do is drive slower. Zipping around big ol' Buicks in my Miata is my way of having fun while minimizing transportation delays.
Greg: Talking about transportation delays, the time it takes a sample to get through and be processed by a sample system and an at-line analyzer, which is often like a little chemical plant, creates problems that go well beyond the horrendous dead time. The lack of intermediate values creates a stepped response that wreaks havoc with a PID, which expects a continuous measurement. The wireless PID algorithm discussed in "Unlocking the Secret Profiles of Batch Reactors" (July '08) and "Is Wireless Process Control Ready for Prime Time" (May '09) shows promise for dealing with the stepped response. However, you still have the whole reliability, maintenance and expertise issue of at-line analyzers. Strangely enough, the compositions and QA data from labs and raw material delivery sheets still mostly reside in spreadsheets and lab systems. Maybe 30 years after the appearance of the DCS, we can finally get process compositions and QA data into the data historian, time-stamped to when the lab sample was taken. Just think what we could do with data analytics and statistical models to diagnose problems and predict compositions. Now for a timely contribution from our favorite and only source of insightful top 10 lists, Randy Reiss.
10. Your mom called and she has an idea on how to align the sensor data with the lab grab sample, and it involves day-old toast, carrier pigeons and a hose. It's the best solution you've heard so far.
9. Your DCS historian is humming right along.
8. Every model you generate is different from the one before, even when you don't change anything.
7. Your statistic for gauging model effectiveness is no more reliable than an intestinal emission in a windstorm.
6. QA values for six consecutive batches are identical.
5. Your team lead says, "All we need to do now is generate models."
4. Although your presentation to the project team about collecting and aligning the data includes the analogies of herding cats, pushing rope and peeing in the wind, there are still no questions.
3. The director of the Colorado State Correctional Facilities just called to let you know the QA data for batch #4567723 is scratched on the wall in cell block D. You're elated!
2. Your four-year-old stays up at night comforting you when you wake in a cold sweat screaming "The data! The data! We must get THE DATA!!"
1. Your data utility indicates a successful extraction by crashing.