Ensuring an Accurate Result in an Analytical Instrumentation System Part 2: Calibrating the Analyzer

Overview:

In many analytical instrumentation systems, the analyzer does not provide an absolute measurement. Rather, it provides a relative response based on settings established during calibration, which is a critical process subject to significant error. To calibrate an analyzer, a calibration fluid of known contents and quantities is passed through the analyzer, producing measurements of component concentration. If these measurements are not consistent with the known quantities in the calibration fluid, the analyzer is adjusted accordingly. Later, when process samples are analyzed, the accuracy of the analyzer's reading will depend on the accuracy of the calibration process. It is therefore imperative that we understand how error or contamination can be introduced through calibration; when calibration can, and cannot, address a perceived performance issue with the analyzer; how atmospheric pressure or temperature fluctuations can undo the work of calibration; and when to calibrate and when not to.
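
To illustrate the adjustment step described above, the following Python sketch compares an analyzer reading against the known concentration of a calibration fluid and computes a simple gain correction. The function names, the 2% tolerance, and the assumption of a linear analyzer response are illustrative assumptions only; actual calibration procedures are analyzer-specific.

    # Minimal sketch of a single-point calibration check, assuming a linear
    # analyzer response. The tolerance and gain-adjustment approach are
    # illustrative, not a specific vendor procedure.

    def calibration_gain(known_conc_ppm: float, measured_conc_ppm: float) -> float:
        """Return the gain correction that maps the measured reading onto the
        known concentration of the calibration fluid."""
        return known_conc_ppm / measured_conc_ppm

    def needs_adjustment(known_conc_ppm: float, measured_conc_ppm: float,
                         tolerance: float = 0.02) -> bool:
        """True if the analyzer reading deviates from the calibration fluid's
        known concentration by more than the allowed relative tolerance."""
        relative_error = abs(measured_conc_ppm - known_conc_ppm) / known_conc_ppm
        return relative_error > tolerance

    # Example: calibration fluid certified at 50.0 ppm, analyzer reads 48.5 ppm.
    known, measured = 50.0, 48.5
    if needs_adjustment(known, measured):
        gain = calibration_gain(known, measured)   # ~1.031
        print(f"Adjust analyzer response by gain factor {gain:.3f}")
    else:
        print("Analyzer reading is within tolerance; no adjustment needed")

In practice, the adjustment may be applied automatically by the analyzer's own calibration routine rather than computed externally; the point of the sketch is simply that the correction is only as good as the known values of the calibration fluid.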


Authors: Doug Nordstrom and Tony Waters, Swagelok Company | File Type: PDF
