
Process Analyzers. Analyze This!

Dec. 7, 2011
Fewer Analyzer Specialists Are Left Working on the Plant Floor. Could It Be Because Newer Analyzer Systems Are Available as Complete Packages?

Greg McMillan and Stan Weiner bring their wits and more than 66 years of process control experience to bear on your questions, comments, and problems. Write to them at [email protected].

By Greg McMillan and Stan Weiner

Greg: What we ultimately want to know and control is the composition of a stream. The lack of analyzers to tell us this is the primary limitation on more advanced process control and optimization (see my blog post, "Top Ten Limitations – Analyzers"). I have connections with the strong ISA analysis division through my pH expertise. I recently interviewed Jim Tatera, the analyzer track chair. Jim retired from a long career as an analyzer specialist at a very large chemical company and is now the senior process consultant at Tatera & Associates, Inc. Jim started out in analytical research labs and moved into online applications after developing an appreciation of the opportunities in production from discussions with operations and process support personnel. We are fortunate to have Jim and his colleagues help us appreciate the rich history and future of process analyzers over the next several months.

Stan: We had quite a strong group of analyzer specialists in the 1960s and 1970s in our large chemical intermediates plants with continuous processes for monomers and polymers. The batch processes had fewer analyzer applications. Newer agricultural batch processes with higher value-added products had slightly more analyzers than specialty chemical batch processes. Mature batch processes relied almost entirely on lab analysis.

Greg: I just talked to the analyzer specialist at a nylon plant I supported in Florida and found that he was the only one left. I asked why. He said the number of applications has stayed about the same, but they are adequately supported by four technicians. The peak in analyzer specialists in the 1970s was the result of the need to develop, design and build analyzer systems, including circuit boards. Now these analyzers are available as complete packages from analyzer suppliers with integrated smart microprocessors. Were we a special case?

Jim: Most of the mature analyzer technology in use today was developed at chemical plants and refineries by the 1970s. End users designed, constructed and assembled the components, enclosures, sample systems, circuits and interfaces. The analyzer technology was subsequently licensed to today's analyzer suppliers when the process industries decided in the 1980s that oil, gas and chemical production, not analyzer manufacturing, was their core competency. In some cases the licensing of technology included a 50% ownership of shares in the original analyzer companies. Many of these companies were then bought by large automation system suppliers. The number of analyzer specialists has declined dramatically since complete packages that technicians can support became available for purchase. One concern is that the few specialists left onsite or working as retired consultants may not be around in 10 years, stifling new analyzer applications.

Stan: How did you find new analyzer opportunities?

Jim: We had temporary, relocatable process analyzers (TRPAs) donated to the analyzer group by both production and pilot plants. They had been deemed no longer needed for the production unit once sufficient process knowledge was gained. We could trial an analyzer application for only the installation and maintenance cost. Some production units would donate an analyzer if they found they could control the process with a more common and less costly measurement, such as pressure or temperature. Others refused to give them up even if the information was not being used. Another source of opportunities was process engineers who saw a need or opportunity to better control their processes and came to us for assistance.

Greg: Where were analyzers essential for control?

Jim: Analyzers were needed on high-purity columns because temperature was not sensitive to the parts per million of impurities that needed to be controlled. In some cases, we had several optional sample points installed, which proved valuable because steady-state process simulations were not accurate enough to show the best tray for composition control. Another common application was fluidized bed reactors, where analyzers were used to optimize turnarounds for catalyst replacement. Before the analyzers were installed, the catalyst was changed out on a time schedule, regardless of the degree of poisoning or decrease in catalyst activity. The result was either lost yield and capacity when beds had deteriorated unexpectedly, or wasted catalyst and lost production when beds were still productive. In one case a $250K analyzer installation paid for itself in six months of cost savings.

Stan: What was your favorite analyzer?

Jim: If a sample was vaporizable, my first choice was usually the gas chromatograph (GC) because the fundamentals and maintenance were relatively simple. The optional detector and column configurations allowed us to frequently provide the process with a reliable, maintainable, specific and appropriate measurement. A technician could understand and maintain the GC. Often maintenance consisted of timing adjustments or simply replacing sample valves.

Greg: When did you have to pursue other analyzers?

Jim: When the sample wasn't vaporizable, contained solids or bubbles, or required special sample preparation that would add excessive cost and time delay, we went with spectroscopic techniques. IR, UV and Fourier transform infrared (FTIR) had strong enough signals to use selective optical detectors. The weak signals from near-infrared (NIR) and Raman spectroscopy required chemometrics (data-driven multivariate statistical models), such as partial least squares or projection to latent structures (PLS) models. NIR and Raman analyzers have been developed for many applications and have the attractive feature of in situ probes advertised as eliminating the need for a sample system.
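As a simplified illustration of the chemometrics Jim describes, here is a minimal sketch in Python of calibrating a PLS model against synthetic NIR-like spectra with scikit-learn. Every spectrum, concentration and parameter below is invented for illustration; a real calibration would be built from lab-referenced process samples.

```python
# Minimal sketch: PLS calibration of synthetic "NIR-like" spectra.
# All data is invented for illustration; a real calibration set would
# come from lab-referenced process samples.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 200, 300

# Hypothetical concentrations of the component of interest.
conc = rng.uniform(0.0, 10.0, size=n_samples)

# Fake absorbance spectra: one broad band whose height scales with
# concentration, plus baseline drift and measurement noise.
wl = np.linspace(0.0, 1.0, n_wavelengths)
band = np.exp(-((wl - 0.5) ** 2) / 0.01)
spectra = (conc[:, None] * band[None, :]
           + rng.normal(0.0, 0.05, (n_samples, n_wavelengths))  # noise
           + rng.uniform(-0.2, 0.2, (n_samples, 1)))            # baseline

# Fit the PLS model, and judge it by cross-validation rather than by
# the training fit -- the validation point Jim keeps returning to.
pls = PLSRegression(n_components=5)
scores = cross_val_score(pls, spectra, conc, cv=5, scoring="r2")
print("cross-validated R^2 per fold:", np.round(scores, 3))

pls.fit(spectra, conc)
print("predicted concentration of first sample:",
      float(pls.predict(spectra[:1])[0, 0]))
```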

Stan: Why aren't NIR and Raman your first choices?

Jim: These analyzers are much more difficult to understand and troubleshoot. I can explain the function of an FTIR, IR or UV analyzer to technicians in terms they can easily relate to, such as a prism separating the colors of light or a microwave cooker, but when I start to talk about statistical models and billions of calculations in a NIR or Raman analyzer, eyes glaze over. The technician is almost entirely dependent upon the supplier and the analyzer specialist for technical support, and upon the supplier for model support if the supplier developed the PLS models. The development of these models requires special expertise, despite software claims to the contrary that all you need to do is feed in data to get wonderful results. A graduate degree in statistical methods is often a prerequisite.

Greg: Besides the maintenance issue, what about performance?

Jim: As with any statistical model, the prediction is only as good as the comprehensiveness of the samples used to develop the model. Production units are often not allowed to be deliberately run to cover the full range of possibilities, particularly abnormal conditions. The plant's data is also not truly random; it has patterns and internal correlations, often from closed-loop control. Lab samples can be prepared, but differences in temperature and unknown components can invalidate models. Cross-correlated or coincident inputs can lead to erroneous relationships, since the inputs (components) may not be independent variables. Recently, samples spiked with the pure component of interest have been recommended to break offending autocorrelations. A PLS model is indicative of a correlation between the components and spectra, but it is not a guarantee of cause and effect.

Unlike first-principle models used for production unit analysis, PLS models offer no first principles that can readily be used to evaluate whether the identified correlations are deterministic or accidental. It may take years to develop enough data to cover all of the possibilities and to gain confidence in the results. For example, a NIR used on a tank farm took several years to become valid due to variation in seasonal temperatures, and it could still be fooled by an exceptionally cold winter. I have been burned by a NIR that said the process was running great when in reality it was running amok. Generally, NIR or Raman analyzers developed for your proprietary concentrations should be used only for advisory or restricted trim control until there are more than two years of valid results. This is not to say NIR is a last resort, but rather a choice when more selective techniques are not available. The cost of a GC is about the same as a NIR. A sample system is usually required for a GC and may be required for a NIR to prevent coated probes or noisy spectra, but the NIR sample system design is generally less extensive. However, the maintenance cost of a NIR can easily exceed any savings in the sample system. Mass spectrometers can be more costly. Analyzer selection is usually driven less by cost than by what works. We used NIR to help monitor and control polymerization condensation reactions and some blending applications.
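To make the cross-correlation trap above concrete, the following sketch (again entirely synthetic) trains a PLS model for component A while a second component B tracks A in the calibration data, then breaks that correlation the way a spiked sample would. The bands and concentrations are hypothetical.

```python
# Sketch of the cross-correlation trap: components A and B are
# correlated in the training data, so a PLS model for A leans on
# B's spectral band and fails once the correlation is broken.
# All spectra and concentrations are synthetic, for illustration only.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
wl = np.linspace(0.0, 1.0, 200)
band_a = np.exp(-((wl - 0.3) ** 2) / 0.005)  # A's absorbance band
band_b = np.exp(-((wl - 0.7) ** 2) / 0.005)  # B's absorbance band

def spectra(a, b):
    x = a[:, None] * band_a + b[:, None] * band_b
    return x + rng.normal(0.0, 0.02, (len(a), len(wl)))

# Training data: B tracks A exactly -- the kind of internal
# correlation that closed-loop control can produce.
a_train = rng.uniform(0.0, 10.0, 300)
b_train = a_train.copy()
pls = PLSRegression(n_components=2).fit(spectra(a_train, b_train), a_train)

# "Spiked" test: A is raised while B stays put, breaking the pattern.
a_test = np.array([2.0, 8.0])
b_test = np.array([2.0, 2.0])  # B no longer tracks A
pred = pls.predict(spectra(a_test, b_test)).ravel()
print("true A:", a_test, " predicted A:", np.round(pred, 2))
# Expect the second prediction to be biased low: the model learned to
# split its weight across both bands and assumed B would rise with A.
```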

Stan: Where has NIR been most successful across the process industry?

Jim: NIR is the standard for octane analyzers because of the maturity, generic nature and definitiveness of the application. If there is a big enough market, the analyzer manufacturers will develop robust models, updating them as necessary.

Greg: For the prediction of corn fermentability and, hence, essentially yield, a near-infrared transmittance (NIR-T) analyzer has been developed by Monsanto that enables a reduction in the carbon footprint and in corn use, which accounts for more than 50% of ethanol production cost. Since ethanol is obviously not present in the corn, and the first-principle relationships between starch and nutrient content and the conversion to sugars and ethanol are not sufficiently defined, a statistical model is an obvious choice. Monsanto updates the model yearly and freely provides it to ethanol producers to make the industry more efficient. I developed a simple production rate controller that immediately reduces corn feed rate to take advantage of an increase in predicted fermentability, using the enhanced PID developed for wireless (See Control, May 2009, p. 54, "Is Wireless Process Control Ready for Prime Time?").

A running average of batch time to ethanol end point is used to automatically correct the predicted ethanol yield online. 
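A minimal sketch of that kind of online correction follows, with invented variable names, units and tuning values. It is not the actual controller described above; it only illustrates the idea of trimming the analyzer's predicted yield with a running average of batch times and using the corrected yield to set corn feed rate.

```python
# Hypothetical sketch of online yield correction and feed-rate
# adjustment. All names, units and numbers are invented.
from collections import deque

class YieldCorrector:
    """Trim the analyzer's predicted ethanol yield using a running
    average of observed batch times to the ethanol end point."""
    def __init__(self, nominal_batch_hr=55.0, window=10, gain=0.01):
        self.nominal = nominal_batch_hr    # expected batch time, hours
        self.times = deque(maxlen=window)  # recent batch times
        self.gain = gain                   # yield trim per hour of deviation

    def record_batch(self, batch_time_hr):
        self.times.append(batch_time_hr)

    def corrected_yield(self, predicted_yield):
        if not self.times:
            return predicted_yield
        avg = sum(self.times) / len(self.times)
        # Batches running longer than nominal imply the analyzer's
        # predicted yield was high, so trim it down (and vice versa).
        return predicted_yield - self.gain * (avg - self.nominal)

def corn_feed_rate(base_rate_tph, corrected_yield, nominal_yield):
    # Higher predicted fermentability (yield) means less corn is
    # needed for the same ethanol production rate.
    return base_rate_tph * nominal_yield / corrected_yield

corrector = YieldCorrector()
for t in (54.0, 57.0, 58.5):              # recent batch times, hours
    corrector.record_batch(t)
y = corrector.corrected_yield(predicted_yield=2.85)  # invented units
print("corrected yield:", round(y, 3))
print("corn feed rate:", round(corn_feed_rate(100.0, y, 2.80), 1), "t/h")
```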

Stan: In future months we'll cover other analyzers, such as viscometers, microwave and nuclear magnetic resonance analyzers; get a perspective on emerging opportunities in batch and continuous processes; and look at how to revitalize end-user analyzer expertise.

Greg: We conclude with a poignant Top Ten List.

Top Ten Reasons Not to Buy an Analyzer

(10) Don't want to upset your friends in the lab by second guessing them
(9) Some things are better left unknown
(8) Analyzer specialist looks awfully old
(7) Can't toss analyzer in the back of a pickup truck
(6) Like playing "Who Dun It?"
(5) Temperature trends are a lot smoother than analyzer trends
(4) Don't want some fancy new advanced control system
(3) Don't need real-time optimization; we are running the best we can
(2) Cool new software can predict everything by just feeding it all your data and pressing a button
(1) Consulting firm says if you stop buying analyzers, you can afford bigger executive bonuses