Many people are under the false impression that they don’t have to calibrate their digital transmitters, in part because they don’t have the facilities to do so with the accuracy possible at the factory. However, skipping even a regular reference check, including one at commissioning, is a bad idea. With today’s smart instruments, such a check is easier than ever.
A simple example using the HART digital signal is to use the digital process variable (PV) to verify the 4-20 mA reading at the zero, 50% and full-scale inputs to the transmitter. This identifies any errors along the analog signal circuit (the D/A converter in the field device, the cable and the A/D converter in the I/O card). Other potential sources of error that can be identified this way include ground-loop offsets and losses due to cable resistance.
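The arithmetic behind this three-point check is simple enough to sketch. The snippet below (an illustration, not vendor software; all readings are hypothetical) computes the ideal mA output for each test PV and reports the measured deviation as a percentage of the 16 mA span:

```python
# Sketch of a three-point (0%, 50%, 100%) analog-loop check: compare the
# transmitter's HART PV against the mA reading seen at the I/O card.
# All numeric readings below are hypothetical examples.

def expected_ma(pv, lrv, urv):
    """Ideal 4-20 mA output for a PV on a range of lrv..urv."""
    return 4.0 + 16.0 * (pv - lrv) / (urv - lrv)

def loop_errors(readings, lrv, urv, span_ma=16.0):
    """Return (pv, measured mA, error as % of mA span) for each test point."""
    results = []
    for pv, measured_ma in readings:
        err_pct = 100.0 * (measured_ma - expected_ma(pv, lrv, urv)) / span_ma
        results.append((pv, measured_ma, err_pct))
    return results

# Hypothetical 0-100 degC transmitter checked at 0%, 50% and 100% of range:
readings = [(0.0, 4.02), (50.0, 12.05), (100.0, 19.97)]
for pv, ma, err in loop_errors(readings, lrv=0.0, urv=100.0):
    print(f"PV={pv:6.1f}  measured={ma:6.2f} mA  error={err:+.3f} % of span")
```

If the PV itself is wrong, this check won’t catch it; that is what a traceable sensor calibration against a reference standard is for.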
Even though digital devices are inherently more stable than their analog predecessors, they are also specified to much tighter tolerances than in the past. In addition, digitizing instruments still contain analog circuitry, such as the process sensor (for example, a capacitance cell or Wheatstone bridge), preamplifiers and buffers, whose performance can change over time. Therefore, digital devices are not exempt from regular calibration.
Not calibrating carries its own costs:
- Falsely passing or failing a quality specification has costs. In discrete operations, false passes can send inferior products to customers, while false failures end up in the reject bin, ruining yields and prompting costly rework or discards. In process operations, the equivalent of a false failure is product giveaway: to compensate for measurement error and still meet the minimum specification, additional processing is often required.
- Commerce depends on globally agreed-upon standards of weights and measures. Only traceable calibration can ensure adherence to these standards, especially for custody-transfer measurements on which payment is based.
- Contractual requirements may stipulate a regular calibration regimen where the penalty for non-compliance could be fines or loss of business.
- Calibration can reveal an underlying problem that could evolve into a costly failure, thus preventing an expensive unplanned outage.
Making effective use of this information requires a calibration management system, not only to assist with scheduling calibration procedures, but also to track the results of each calibration in one place. That way, changes over time that form a trend, which may indicate a larger underlying problem, can be identified early. The International Society for Pharmaceutical Engineering’s (ISPE) Good Automated Manufacturing Practice (GAMP) guidelines for manufacturers and users of automated systems in the pharmaceutical industry rely heavily on the traceable calibration documentation a calibration management system provides.
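The kind of trending a calibration management system performs can be illustrated in a few lines. This sketch (not any specific product’s API; dates, errors and the tolerance limit are hypothetical) fits a least-squares slope to the as-found errors from successive calibrations and estimates when the drift will reach the tolerance limit:

```python
# Illustrative drift-trending sketch: fit a least-squares line to as-found
# calibration errors (% of span) over time and project when a hypothetical
# tolerance limit would be reached. All data below is made up.
from datetime import date

def drift_per_year(records):
    """Least-squares slope of as-found error vs. time, in % of span per year."""
    t0 = records[0][0]
    xs = [(d - t0).days / 365.25 for d, _ in records]
    ys = [err for _, err in records]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Hypothetical as-found errors from four annual calibrations:
history = [
    (date(2021, 1, 15), 0.05),
    (date(2022, 1, 20), 0.11),
    (date(2023, 1, 18), 0.19),
    (date(2024, 1, 16), 0.24),
]
slope = drift_per_year(history)
limit = 0.25  # hypothetical tolerance, % of span
years_left = (limit - history[-1][1]) / slope if slope > 0 else float("inf")
print(f"drift ~ {slope:.3f} %/yr; tolerance reached in ~ {years_left:.1f} yr")
```

A steady upward slope like this, even while every individual calibration still passes, is exactly the early warning that justifies keeping all results in one place.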
Another series of useful documents describing how to evaluate uncertainty in measurement data comes from the Joint Committee for Guides in Metrology (JCGM). “Evaluation of Measurement Data — Guide to the Expression of Uncertainty in Measurement” and “Evaluation of Measurement Data — An Introduction to the Guide to Expression of Uncertainty in Measurement and Related Documents” are two of the five documents available as a ZIP file from the Bureau International des Poids et Mesures. Another useful publication, developed by the calibration tool company Beamex, is the Ultimate Calibration book, with approximately 200 pages of information on the hows and whys of calibration, available for download from the Beamex website.
Though wireless devices are primarily digital, they still need to be calibrated, and where a wireless infrastructure exists, calibration information can be exchanged in real time with the technician in the field or with the calibration management system.