
Process monitoring, analysis gain intelligence, but still need acceptance

April 25, 2019
Many types apparently continue to be viewed as outside-the-loop devices that can only contribute indirectly to process optimization or are too costly and inaccurate

Process analyzers are still too often on the outside looking in. Even though they've added innovations, automation, new capabilities and ease-of-use in recent years, many types apparently continue to be viewed as outside-the-loop devices that can only contribute indirectly to process optimization or are too costly and inaccurate. These old prejudices are unfortunate because analyzers, sampling systems and similar solutions can greatly improve productivity and competitiveness of process applications, but these benefits are only gained if potential users recognize their value and invest in them.

"We’re seeing a number of global megatrends affecting analyzers, including more stringent emissions regulations and the need for process controls that give users more and better data that adds value to their processes” says Trevor Sands, president of Servomex. “Consequently, we’re continually enhancing our analyzers to ensure they keep providing quality usable data required, while remaining highly stable and easy to maintain. We know users want analyzers they can install and leave alone without frequent recalibration, and that’s what we aim to deliver."

John Kerney, natural gas marketing manager, Ametek Process Instruments, adds: "We're seeing solid growth globally for process analyzers as end users drive for improving process efficiency, optimizing process throughput, and meeting safety and environmental requirements. In the oil and gas sector in particular, feedstock availability is more expansive and in some cases less expensive, which is supporting capital expenditure for expansions of existing plants and construction of new ones. Users are also facing a more demanding regulatory environment, both in regulation and enforcement, which is fueling the need for more process monitoring and control. Finally, workforces are aging and retiring, and users are asking for analyzers that are smarter, simpler and easier to maintain. When users evaluate process analyzers for projects, capital costs have historically been the key driver. In our industry, we're seeing growing interest in understanding the total cost of ownership (TCO) for analyzers. Even though the capital cost of a particular analyzer may be higher than an alternative's, users see the benefit of lower TCO for an analyzer that delivers better reliability, lower maintenance and greater access to data."
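
To make the TCO comparison Kerney describes concrete, here's a minimal sketch; all prices, maintenance costs and downtime figures are hypothetical, chosen only to show how a higher capital cost can still produce a lower lifetime cost.

```python
# Illustrative total-cost-of-ownership (TCO) comparison for two analyzers.
# All figures are hypothetical; real TCO models also weigh calibration gas,
# shelter/HVAC, spare parts and the cost of lost measurement availability.

def tco(capital, annual_maintenance, annual_downtime_hours,
        cost_per_downtime_hour, years=10):
    """Sum capital cost plus recurring costs over the service life."""
    recurring = (annual_maintenance
                 + annual_downtime_hours * cost_per_downtime_hour)
    return capital + recurring * years

# Analyzer A: cheaper to buy, but needs more upkeep and fails more often.
a = tco(capital=60_000, annual_maintenance=15_000,
        annual_downtime_hours=40, cost_per_downtime_hour=500)

# Analyzer B: higher capital cost, lower maintenance and better uptime.
b = tco(capital=90_000, annual_maintenance=5_000,
        annual_downtime_hours=8, cost_per_downtime_hour=500)

print(f"Analyzer A 10-yr TCO: ${a:,}")   # $410,000
print(f"Analyzer B 10-yr TCO: ${b:,}")   # $180,000
```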

Ages of analyzers

Like the mainstream process control applications they serve, analyzers have had many twists and turns in their history, too.

"The advent of programmable logic controllers (PLC) in about 1969 also marked the beginning of process analyzers that were basically adaptations of benchtop instruments," says Marcus Trygstad, advanced analytical technology consultant, formerly of Yokogawa Corp. of America. "In the 1980s, gas chromatographs (GC) took off because they were simple and perfect for composition analysis in hydrocarbon processing. Over the years, process analyzers used in refining were mostly automated versions of lab analyzers. But with the introduction of tunable diode lasers (TDL) in the 2000s, advanced analyzers began moving from shelters, and became more pipe-centric because they could be installed at or near sample taps. More recently, TDLs were joined by Raman, quantum cascade laser (QCL) and other spectroscopic techniques.

"Now, the analyzer silo is showing signs of being subsumed into the automation whole and the Industrial Internet of Things (IIoT). Where discrete analyzers previously produced measurements that were considered alone, digitalization means analyzers and their data will be selected, prepared and digested with one big data meal. In my view, Process Monitoring (PM) 4.0 is about technology fusion and the execution of analysis in broader digital environments."      

Emulating expertise

Because analyzers are traditionally implemented and maintained by a smaller subset of process control professionals, they're also more deeply impacted by the field's accelerating retirements. As a result, many suppliers are trying to offset diminishing expertise by making their analyzers easier to use.  

"The older analyzer technicians are retiring, and there's not a lot of people to replace them, fewer support staff, and less-experienced replacements," says Tracy Doane, U.S. product manager for analysis products at Endress+Hauser. "For wet chemical and gas measurement, we're developing easier to use analyzers that can still perform high-precision measurement, but have simpler interfaces, and easier access for maintenance and remote support. This drives our R&D, so while basic measurements like ammonia remain the same, how we gather data and what's done with it is changing."

For instance, Doane reports that Endress+Hauser partnered with Dow Chemical several years ago to take over the pH measuring points serving electrolysis in the chlor-alkali process at Dow's integrated chemical complex in Freeport, Texas. Endress+Hauser fitted the measuring loops with its Memosens digital sensor probes, and devoted three staffers to maintaining them.

"The analog measuring points were causing a lot of trouble there," says Doane. "The high-resistance signals were susceptible to faults and malfunctions. This caused the maintenance team to put in plenty of nonscheduled working hours, but Memosens let Dow upgrade to new technology on a common platform." Memosens converts measured values to digital signals inside the sensor, and its probes rely on magnetic induction for power and signal transmission. "This tackles the major problems in pH management at their roots," she adds.

Eventually, Dow's supported pH loops increased fivefold to more than 900, while Endress+Hauser technicians look after multiple facilities in Texas. Meanwhile, Memosens sensors are supported by Endress+Hauser's Liquiline M CM42 two-wire transmitter for pH/ORP, conductivity or oxygen, the Liquiline CM44 multi-parameter transmitter, or the Liquiline CM44R DIN-rail analyzer for the parameters covered by the other two, plus disinfection, turbidity/TSS, ammonia and nitrate.

“We’ve triggered a continuous improvement process,” says Doane, who adds many of the measuring points will be gradually retrofitted with wireless signal transmission using WirelessHART. “We want to get to the point where we centrally analyze all the sensor status information to improve maintenance management even further.”

Fewer staff? Streamline with software

To compensate for retirements and dwindling expertise, users and suppliers are developing analyzers that are easier for less-experienced users to implement and maintain.

"No process applications can run without analyzers, but they're facing the same demographic trends as other technologies—aging personnel, retirements and lost knowledge," says Jean-René Roy, global product group manager, analytical and force measurement, ABB. "However, it's difficult to find people with the right skills and analyzer knowledge. Plus, it's often hard to justify the operating expenditures (OpEx) to maintain analyzers."

As a result, ABB has been realigning its focus with its ABB Ability digital platform, and in the next few months, it will launch its Analyzers on ABB Ability products and services, which will let users implement most analyzers, including continuous gas analyzers, process GCs and Fourier transform infrared (FTIR) units. "Analyzers on ABB Ability will provide automatic health checks of analyzers, remote assistance and troubleshooting, and cybersecurity," explains Roy. "This will allow more condition-based maintenance, instead of just scheduled maintenance, which means more analyzer uptime by predicting problems and better overall resource management."

In PM 4.0, analyzers lose their identity as standalone measurement devices and are integrated with software-based sensors. Yokogawa's Trygstad sees their combination as a way for each to offset the weaknesses of the other. "Many engineers hate ‘hard’ analyzers, but love inferential soft sensors because the models are cheap to implement and provide nearly instantaneous readings," he says. "Yet, the models use process parameters (manipulated variables) that relate only indirectly to the chemical properties of interest (control variables). The circumstantial correlations have good, short-term precision but poor long-term robustness. What’s needed is a way to automatically keep these models honest. That’s what PM 4.0 is about.

"The usual approach is to grab samples once per shift or once per day, and apply a temporary offset using the lab result. However, the problem is those lab results represent solitary, disconnected moments in the process. Using soft and hard analyzers in combination provides mutual validation. More important, it reduces the measurement uncertainty by 80%, making it possible to run the process closer to the control limits. This can reduce energy, improve product yields or quality, or prolong catalyst life. Using analyzers in conjunction with soft sensors can create a solution that's better than its parts and provide much better fidelity and process control."

Proper payback

Though they still feel like voices in the wilderness, advocates continue to stress that properly designed and implemented analyzers are good investments that can be hugely beneficial to users, improve their safety and are increasingly easy to employ.

"Existing analyzers have been beefed up with more computing power and algorithms, so they deliver more valid data with the same hardware," says Robert Sherman, principal technologist, Enterprise Consultants International. "However, the general, senior-management prejudice persists that analyzers are too costly, need lots of maintenance and won't work anyway, even though they're faster and more accurate than they used to be. The reality is comprehensively engineered, meticulously installed and properly calibrated analyzers can be valuable tools contributing to profitable operations."

For instance, Sherman recently observed a heated-cell paramagnetic oxygen analyzer monitoring atmospheric vent gas from a crude distillation vacuum tower, whose sample probe and heated sample transfer line weren't properly insulated, letting water vapor condense at cold spots along the line. Just one drop of water could destroy the $3,000 heated paramagnetic analyzer cell. "This probe and analyzer helped prevent the oxygen level in the tower from reaching an explosive level, so they were essential for unit operations. However, in this case, it was taking approximately 24 man-hours each day to drain water from the sample conditioning system's condensate trap. One drop of water swept off the top of this condensate trap, and the analyzer cell would be destroyed. The plant was replacing approximately 18 analyzer cells per year! It was a maintenance nightmare," says Sherman. "A system retrofit (a properly insulated probe, properly engineered and installed heat trace, a properly insulated sample transfer line, and a heated-and-insulated sample conditioning system enclosure) resulted in totally reliable analyzer operation with no heated-cell failures. The analyzer validated daily via an automated routine, and no analyzer system maintenance was required for 30 months."

Sherman adds that analyzers have gained computing power (signal averaging, more precise temperature control and better flow stability) thanks to the microprocessors added to them during the past 5-10 years. "As their on-board computing power increases, analyzers can do more calculations, take more measurements, and analyze more data for more precise results," says Sherman. "Previously, a typical analyzer might deliver one data point per second over a 4-20 mA signal. Now, users can get 100 data points per second, average them every 30 seconds, use statistical tools to identify and exclude outliers, produce statistically significant average values, and provide much more reliable and precise data for use by process operations.

"Likewise, where spectrometers using lamps and gratings used to get a couple of wave numbers at about 1-nanometer resolution, they're now using lasers and getting 1/10-nanometer resolution for much more precise analytical results. However, since a laser-based analyzer is looking at a smaller part of the spectrum, multiple analyzers are needed to monitor different substances, such as methane, carbon monoxide and oxygen in fired heater control applications. This is where TDLs can measure two or three compounds from one “analyzer head” because they have two or three lasers (analytical engines) in one analyzer head."

[Please note: Bob Sherman also contributed to "Water-in-oil analysis uses controlled vortices," Control, Apr '18, p. 56.]

Mike Garry, product manager for process spectroscopy, Thermo Fisher Scientific (www.thermofisher.com), adds that, "Laboratory instruments measuring grab samples from the production process don’t provide immediate answers, and may not give an accurate assessment of the in-process materials. Process spectrometers, such as near infrared (NIR), infrared (IR) and Raman, use optical spectroscopy to monitor chemicals directly under actual process conditions. This proximity provides real-time feedback on the materials at the molecular level, ensuring consistent production and allowing conditions to be changed on the fly to improve yields, reduce waste and ensure product quality."

Garry reports he and his colleagues at Thermo Fisher Scientific are observing six process monitoring trends, including:
• Large increase in 2018 capital investment for online analyzers to help improve production quality, save labor and reduce costs;
• Continuing migration of analytical functions from lab to line;
• Expansion in the use of process analytical technology (PAT) in polymer and specialty chemical markets;
• Increased use of analyzers by large, multinational companies at more worldwide locations;
• Continuing growth in the biopharmaceutical market, with a need for different types of analyzers; and
• Global warming and environmental initiatives expanding the use of continuous gas emission monitoring in industrial and transportation sectors.

Garry adds that Thermo Fisher Scientific has been responding to these trends and challenges with several innovations, including:
• Expansion in the use of Raman instrumentation;
• Improvements in sample handling using fiber-optic probes;
• Smaller, lighter-weight spectrometers; and
• Direct coupling of instrument technologies to perform multiple measurements on the same sample at the same time, such as adding Raman to an X-ray surface analysis spectrometer, or combining a rheometer with an FTIR or Raman to provide chemical and physical measurements.

"For instance, when measuring additives in polymers, a grab sample is commonly taken to the laboratory where it is analyzed using IR spectroscopy," explains Garry. "A customer wanting more immediate feedback and control of their process put an FTIR instrument at line and automated the analysis, to obtain feedback every few minutes. The instrument displays result with easy to interpret pass/fail indications and automatically sends them to the plant control room allowing operators to quickly flag any problems and address them quickly. These instruments contribute to the plants return on investment (ROI) by reducing the risk to the manufacturer of shipping out-of-spec product which may result in recalls. By investing in these analyzers, they can evaluate product more frequently and obtain more consistent quality.

"Recent innovations resulting in improved instrument hardware, software integration, data handling and network communications are leading to better system utilization. For example, our new Nicolet Summit FTIR spectrometer has active visual feedback using light bars that ensure the operator the instrument is working properly and they're getting good results. Further, the system can connect wirelessly to smart phones, tablets and cloud-based applications. While this system is not a process analyzer, it shows where we're going with our instruments and the ability to monitor their own health, and use the latest technology for data connectivity. There's resistance to this type of connectivity within the industry because companies want their data to stay securely behind their firewall, however they're operating more and more in a connected global environment. The availability and improvement of secure clouds will allow them do this with more confidence as we move into the future."

Lasers living large

Chief among the more recent advances in process analyzers is the emergence of laser-enabled measurement, which logically grew out of earlier, similarly light-based methods.

"Historically, many analyzers used non-dispersive infrared (NDIR), chemiluminescence or paramagnetic oxygen devices, but they were often complex and prone to drift," says Dave McMillen, North America business development manager, Rosemount Quantum Cascade Laser (QCL) gas analyzers, Emerson Automation Solutions, which acquired Cascade Technologies and its QCL equipment in 2014. "Now, we're using semiconductor-based lasers that produce light on a chip, giving them the wavelength for the molecule our analyzer is seeking, and using absorption spectroscopy to find those molecules and measure their concentration. We're just using a different type of light source, but it's generally less complex and more stable, so maintenance costs less, too."

For instance, BP’s Cherry Point refinery in Blaine, Wash., has completed a 50-day stability trial of Emerson’s Rosemount CT5400 hybrid QCL/tunable diode laser (TDL) analyzer to characterize flue gas from its calciner hearth for environmental reporting. The analyzer is installed after a thermo-electric chiller to remove excess moisture from the flue gas, which is drawn through the analyzer with an eductor. The analyzer performs all the measurements typically done by chemiluminescence, infrared, ultraviolet (UV) and paramagnetic technologies to analyze O2, CO2, CO, SO2 and NOx.

“All of our heaters have operating permits issued by EPA and state regulators, which specify emission limits and require continuous emissions monitoring system (CEMS) testing,” explains Ryan Holgate, analyzer engineer, maintenance, BP Cherry Point. “This testing includes daily validation of analyzers with check gas, quarterly stability testing against check gases, and yearly relative accuracy test audit (RATA) of stack gases compared to a third-party reference. In our daily validation testing over 50 days, all operations were well within limits, and we found QCL to be very stable.”

Holgate reports that BP Cherry Point's annual RATA compares CEMS data to information collected from an independent EPA test method, and passing a RATA means matching the test method, though it's not necessarily a measure of accuracy. "A RATA usually includes nine runs at 21 minutes per run, and data is recorded at one-minute averages," adds Holgate. "The tester performs analyzer bias checks between each run, and traverses the stack diameter to eliminate any stratification concerns by moving the probe every seven minutes. We needed to match the performance of the EPA's existing test method before we could adopt QCL technology. We did 10 runs, and we passed for all gases (O2, CO2, CO, SO2 and NOx) within EPA's specs. During the 50-day trial, QCL showed it was stable, met the EPA's performance requirements, and had lower maintenance requirements. We anticipate QCL/TDL will give us greater data availability with less downtime, and QCL has already been approved for our next two heater applications."
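
For context, a RATA's relative-accuracy statistic is commonly computed as the absolute mean CEMS-versus-reference difference plus a confidence coefficient, taken as a percentage of the reference-method mean (as in EPA performance specifications such as 40 CFR 60, Appendix B). The sketch below assumes that form; the nine paired runs are made up for illustration.

```python
# A sketch of the relative-accuracy statistic used in a RATA: the mean
# CEMS-vs-reference difference plus a confidence coefficient, expressed as
# a percentage of the reference-method mean. Run data below are made up.

import math
import statistics

# t-values at 2.5% significance for n-1 degrees of freedom (n = 9..12 runs)
T_VALUES = {8: 2.306, 9: 2.262, 10: 2.228, 11: 2.201}

def relative_accuracy(cems, reference):
    n = len(cems)
    diffs = [c - r for c, r in zip(cems, reference)]
    d_mean = statistics.fmean(diffs)
    s_d = statistics.stdev(diffs)
    cc = T_VALUES[n - 1] * s_d / math.sqrt(n)   # confidence coefficient
    return (abs(d_mean) + abs(cc)) / statistics.fmean(reference) * 100.0

# Nine hypothetical paired runs (e.g., NOx in ppm), CEMS vs. reference method
cems = [41.2, 40.8, 42.0, 41.5, 40.9, 41.7, 41.1, 41.4, 41.0]
ref  = [40.9, 40.5, 41.6, 41.2, 40.4, 41.3, 40.8, 41.1, 40.6]

print(f"Relative accuracy: {relative_accuracy(cems, ref):.1f}%")
```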

Because analyzers used to require more maintenance and were more costly, McMillen adds that many potential users were reluctant to deploy them, even though they can provide quality, environmental and process monitoring that can optimize many operations. "I used to work with pH analyzers, and many people thought of them as black magic. I think this was because they involved more chemistry and were more complex, so they required more experience, and it was more of an art to use them," he says. "Technicians can learn to use transmitters pretty quickly, but analyzers take more time, and require more of an apprenticeship approach."

More recently, these learning curves for analyzers have been shrinking thanks to technical advances like QCL and others. "Older analyzers can have errors, and you may not know it," says McMillen. "The advantage of QCL is that it can't be in failure mode without the user knowing it."

The latest analyzers are also returning results more quickly, which lets users make faster, more effective decisions. In fact, Emerson just launched its Rosemount CT4400 continuous gas analyzer, which it reports is the first purpose-built QCL/TDL analyzer designed to help plants reduce ownership costs and report emissions accurately in environmental monitoring applications measuring standard components, such as NO, NO2, SO2, CO, CO2 and O2. Because it can hold as many as four laser modules, CT4400 can simultaneously measure up to seven application-specific gas components, providing flexibility in CEMS applications. This simultaneous, multi-component analysis in a single analyzer reduces the need for multiple analyzers and the cost of using them.

"We're aiming to get the analysis point closer to the sampling point, and having more and smaller analyzers in the same sized box, being field-mountable and being certified for use in Class 1 Division 2 classified areas is making it possible because they're also less likely to need shelters and other costly support equipment," concludes McMillen.

Regulation response

Another prime mover in the analyzer field is government regulation, which often grows more strict in particular technical areas or geographic regions, and spurs the development and adoption of new analyzers.

"Because less-experienced users still have to do more with less, liquid analyzers are also getting easier to operate and maintain. However, governments are also cracking down on substances like phosphorus in wastewater treatment plant (WWTP) discharges," says Steve Smith, senior product manager for liquid analysis, Endress+Hauser. "Traditionally, WWTPs base the poly aluminum chloride (PAC) dosing in their phosphorus removal applications on a lab's labor-intensive, grab-sample analyses. However, we recently developed our Liquiline CA80 orthophosphate and total phosphorus analyzers, which deliver lab-quality results."

Figure 1: Mountain View Wastewater Treatment Facility in Wayne Township, N.J., recently implemented Endress+Hauser's Liquiline CA80TP total phosphorus analyzer to automatically take samples every 45 minutes, achieve lab-quality results in the field, optimize poly aluminum chloride (PAC) dosing, and comply with more stringent discharge requirements of less than 0.76 mg/L of total phosphorus. Source: Wayne Township and Endress+Hauser

For example, the 13.5-million-gallon-per-day Mountain View WWTP run by the Division of Water and Sewer in Wayne Township, N.J., must discharge less than 0.76 mg/L of total phosphorus in the warmer months due to regulations that became more stringent in 2017. PAC dosing was the only way to comply, but it required accurate, near-real-time data for optimization, as well as an analyzer that wouldn't clog, overdose or break down every week (Figure 1).

After trying multiple analyzers that couldn't free it from grab sampling, the township's WWTP deployed Liquiline CA80TP in August 2017, along with a Y strainer with an 8-mm ID suction tube to extract samples. "Liquiline CA80TP was easy to install compared to other analyzers we tested," says Howard Breder, lab manager, Wayne Township. "Samples are taken automatically every 45 minutes, and large solids are automatically filtered out by the Y strainer, which prevents the clogging we experienced before."

Liquiline CA80TP lets the plant rely on its measurements, adjust PAC dosing in near real-time, and see total phosphorus trends in close agreement with grab-sample analyses. The township's staff is confident the analyzer will provide useful predictive alarms to warn of low reagent conditions and let them resolve faults quickly, resulting in minimal loss of measurement time.

"When the phosphorus level is starting to rise, we can see it in near real-time because samples are taken automatically every 45 minutes,” adds Breder. “We go in and make the adjustment right then. These samples yield accurate data and quality results onsite. We have real data that agrees with results completed in the lab, which take days to process. The end goal is a total phosphorus number you can hang your hat on.”

About the Author

Jim Montague | Executive Editor

Jim Montague is executive editor of Control.