Smart Calibration

How Can You Make Temperature Calibration Faster?

By Greg McMillan, Stan Weiner


Greg: I spent most of my first seven years at Monsanto in the field calibrating and commissioning automation systems. The procedures and documentation were ad hoc and manual. Fortunately, the performance of the instrumentation and calibrators has greatly improved. Here we will look at the automation and integration of calibration systems.

Often when an operational problem occurs, the first question is what maintenance was performed or should have been performed, as noted in the May 2010 Control Talk column, "Drowning in Data, Starving for Information, 4." Knowledge of instrument performance can no longer afford to reside solely in the maintenance shop.

Stan: Ned Espy, technical director at Beamex Inc., has offered to bring us up to date on calibration system achievements and opportunities. Ned, what can you generalize about the state of calibration in the field?

Ned: Calibration efforts are different at every plant, but there's a common need to isolate a transmitter and run a calibration test. Sixty percent to 80% of the instruments being calibrated in the field are used for either pressure or temperature measurement. The goal of field calibration is to reduce the need to bring instruments back to the shop and to eliminate the use of clipboards and laptops to record data. Automated documenting calibrators in the field generate procedures and record results much faster, more accurately and more consistently than is possible with manual systems. The pharmaceutical producers have the greatest documentation requirements, followed by the power generation and pulp and paper industries. The chemical industry is all over the place in terms of its need for documentation.

Greg: In control rooms, we have seen the tremendous benefits of eliminating operator actions. Even if the automation is not the best to start with, it can be improved because the consistency of actions and data lends itself to analysis and improvement via the data historian. The first step for optimization is automation.

Ned: We also see similar opportunities for automated and integrated calibration systems. The time interval between the calibrations of non-critical instruments can be extended or changed from "scheduled calibration" to "calibration on demand." For example, the slope of the documented drift can be used to predict when the offset will exceed the application requirement. The evaluation of the degree and type of calibration adjustments can lead to the selection and justification of better instrument technologies. The use of these better technologies can greatly extend calibration intervals, but data analysis is needed to confirm and implement the opportunity. Additionally, calibration instructions are generated by the calibrator, leading to faster and more consistent procedures that can be improved with experience.
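As a rough sketch of that drift-extrapolation idea (the logged errors, tolerance and variable names below are hypothetical), a least-squares line through documented as-found errors can project when the offset will cross the allowable limit:

```python
# Sketch of drift-based "calibration on demand" (hypothetical data and names).
# A least-squares line through logged as-found errors estimates when the
# offset will exceed the application's tolerance.
import numpy as np

days = np.array([0.0, 90.0, 180.0, 270.0, 365.0])           # days since first calibration
as_found_error = np.array([0.02, 0.05, 0.09, 0.12, 0.16])   # % of span at each check
tolerance = 0.25                                            # allowable error, % of span

slope, intercept = np.polyfit(days, as_found_error, 1)      # drift rate and starting offset
if slope > 0:
    days_to_limit = (tolerance - intercept) / slope
    print(f"Drift rate: {slope * 365:.3f} % of span per year")
    print(f"Projected to exceed {tolerance}% of span at day {days_to_limit:.0f}")
else:
    print("No upward drift detected; the interval can likely be extended.")
```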

Stan: How or where do you find the better technologies and the application requirements?

Ned: A lot of the knowledge of what works best and what reasonable expectations are resides in the heads of experienced maintenance engineers and technicians. Tapping and retaining this expertise is difficult at best.
Process engineers often have unreasonable accuracy expectations and no concept of what is practically achievable. The control engineer may have a better understanding of the control system capabilities, but less of an understanding of the process requirements.

You hear the statement, "It works better over there," with no idea as to whether that is due to calibration, installation or operating conditions. The integration and historization of the data can lead to the sharing of knowledge, building on the capabilities of each profession. We help the users become metrologists, learning fundamentals such as that a calibrator should be at least four times more accurate than the sensor. The savings from more efficient calibration and paperwork reduction have been significant.
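Ned's 4:1 fundamental is often expressed as a test accuracy ratio (TAR). A minimal check, with hypothetical accuracy figures, might look like this:

```python
# Minimal test accuracy ratio (TAR) check for the 4:1 rule (hypothetical values).
def test_accuracy_ratio(sensor_accuracy: float, calibrator_accuracy: float) -> float:
    """Both accuracies expressed in the same units, e.g. % of span."""
    return sensor_accuracy / calibrator_accuracy

sensor_acc = 0.10      # transmitter accuracy, % of span
calibrator_acc = 0.02  # calibrator accuracy, % of span

tar = test_accuracy_ratio(sensor_acc, calibrator_acc)
print(f"TAR = {tar:.1f}:1 -> {'OK' if tar >= 4.0 else 'calibrator not accurate enough'}")
```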

Traceability is critical for the pharmaceutical industry, as discussed in Control's March 2012 article, "The Search for the Asset Management Holy Grail, Part II."

Stan: The Rule of Four strikes again. We want the threshold sensitivity and resolution of the measurement to be less than one-fourth of the control band (allowable control error). Can your calibrators measure sensitivity?

Ned: The calibrators have digital displays with four decimal places, so a test could be done to determine the smallest detectable change in the measurand. The resolution of the display sometimes annoys customers because of fluctuations in the last two digits. Damping can be added to screen out the noise.
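The damping Ned describes is typically a first-order (exponential) filter. A minimal sketch, where the filter factor and readings are hypothetical:

```python
# First-order (exponential) damping to screen display noise (sketch;
# the filter factor alpha is a hypothetical tuning choice).
def damped(readings, alpha=0.1):
    """alpha in (0, 1]: smaller values give heavier damping."""
    y = readings[0]
    out = [y]
    for x in readings[1:]:
        y += alpha * (x - y)  # move a fraction of the way toward each new reading
        out.append(y)
    return out

noisy = [100.0012, 100.0047, 99.9981, 100.0039, 100.0008]
print([f"{v:.4f}" for v in damped(noisy)])
```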

Greg: I can understand that users may get distracted and in some cases obsessed with negligible changes. I can see where it might be useful in some cases to get an idea of how much of the noise in a measurement originates in the sensor. Fortunately, temperature sensors rarely exhibit noise, except from electromagnetic interference picked up by thermocouple extension wires.
