Twenty-plus years ago, a lot of smart and experienced instrument engineers saw no value in the new generation of microprocessor-based "smart" transmitters. The premium you paid for a smart device bought a certain amount of flexibility to rerange the instrument within the bounds of its rated capabilities. To do so, you also had to buy a handheld communicator such as the HART 268 or the Honeywell SFC.
Back in the late '80s, the idea of "configuring" an instrument wasn't really part of our consciousness. You "calibrated" an instrument for a service by applying standards on the test bench in the shop. You turned zero and span screws or trim pots while observing the 4-20 mA (or even the 3-15 PSI) output. It was frequently an iterative process, because diddling one trim pot usually shifted the adjustment you had just made at the other end. If you did it enough, you might get pretty good at anticipating how much to over- or undershoot a span or zero adjustment to improve your rate of convergence.
The whole concept of calibration was a bit of a misnomer, since the standard of the day was a dead-weight tester, which either wasn't used or hadn't been sent out for recertification in years, if ever. I remember our techs using a Wallace & Tiernan box somewhere between the size of a briefcase and a carry-on bag. It had an 8-in. or 9-in. circular gauge in it and weighed about 20 pounds. While these "calibrators" were fairly precise when shipped, the abuse they took probably rendered them increasingly less so over the months between repairs. After all that, your DP cell was connected to an orifice meter run that was probably sized in 1968 using a slide rule. You can imagine there wasn't a lot of reverence for precision, or even a belief it was possible.
The smart transmitter and its requisite handheld communicator were understandably not a natural extension of these practices. The W-T gauge boxes of that day didn't have LCD displays or "soft keys." The number of steps and the complexity of setting up—configuring—a smart device were viewed as confusing and convoluted compared to simply turning zero and span screws. Some of our suppliers even added zero and span screws to their digital transmitters, because the acceptance of handhelds and soft keys was becoming a barrier. "If your guys like having zero and span screws, here you go," said the salesperson. Transmitter manufacturers were also just beginning to figure out characterization of sensors, so a re-ranged transmitter often retained only a fraction of the accuracy and repeatability of the original calibration; if you wanted to preserve the quoted performance, you needed to recalibrate it anyhow.
But today, it's as if that whole decade of handheld heartburn never happened—as if all the trim pots and zero screws have finally fallen off like vestigial tails. Having at last accepted the fully digital transmitter, we've begotten a new generation of technicians who can't be separated from their handheld communicators. Now one can sling a NIST-traceable calibration device over one shoulder and carry a wireless laptop or tablet in the other hand.
When a recent start-up called for a calibration check of a DP flow transmitter we use for only a day or two every two to three years, these were my weapons of choice. The Rosemount 3051C DP cell had been calibrated 0 in. to 100 in. at the factory in 1999, and hadn't seen a workbench since. The rusty plug in its upward-facing, unused conduit connection was a little disconcerting, but it was still communicating without errors on its segment. What was remarkable was that, after more than a dozen years of rain, snow, ice, sun and temperatures from -15 °F to +105 °F, it was still reading within ±0.05 in. of water column or better.
Twenty years ago, we couldn't foresee the remarkable strides digital technology would make in the accuracy, reliability and stability of today's field devices. It's worth considering which novel technologies we dismiss today that could become extraordinary achievements in the coming decades.