
Tying together analytics locations, formats and functions

Sept. 22, 2025
Hargrove Controls & Automation shows how users’ needs should determine analytics types and deployment

One of the most basic decisions to make when implementing a data analytics solution is where it should perform its data processing, and what types and how much to do there. On the edge? In the cloud? A combination of both? And, if so, how much in each location?

“Data analytics software has been evolving from premium, niche products offered by only a handful of vendors to mainstream products offered by a wide range of vendors at various price and performance levels as more vendors see the market potential of their solutions. There truly are solutions for almost any company and use case,” says Heath Stephens, PE, digitalization leader at Hargrove Controls & Automation in Mobile, Ala., a certified member of the Control System Integrators Association (CSIA). “Potential users shouldn’t assume they can't afford to pursue data analytics and AI applications. Both Amazon Web Services (AWS) and Microsoft Azure offer a wide range of data analytics tools at low-cost monthly subscription rates. Process control software companies are also offering many applications that bundle with their basic process control system (BPCS) platforms or can bolt on to competitors' products. Finally, large language model (LLM) AI tools are making headway in the industrial sector by merging ChatGPT or Alexa-type interfacing capabilities with process data.”

Form follows function

Stephens reports that data analytics software packages can be installed either as on-premise solutions or cloud services, depending on the product and application. Since most source data is generated onsite, some locally installed hardware or software components typically collect the data.

“From there, users can choose where to store data depending on the data analytics software they select: physical machine, local virtual machine (VM), corporate-cloud VM, hosted-cloud VM, or a hosted cloud service provided by the data analytics supplier,” explains Stephens. “This choice is affected by many factors, such as available software options, existing IT infrastructure and support, preference for hardware purchases versus subscriptions, and locations of data sources and users.

“Similarly, data analysis and visualization can be performed locally or in the cloud. While certain applications still require software to be installed locally on individual users' PCs, many applications are now web browser-based, allowing a hosted solution from a client's server or a cloud server provided by a third party or the software vendor.”
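To make that local-versus-hosted choice concrete, here's a minimal Python sketch of a collector that buffers readings on-premises or forwards them to a hosted endpoint, depending on configuration. The endpoint URL, tag name and storage layout are illustrative assumptions, not any vendor's actual product.

```python
import json
import sqlite3
import urllib.request

# Hypothetical site configuration: "local" buffers to an on-premises SQLite
# file standing in for a historian; "cloud" posts to a hosted ingest URL.
CONFIG = {"target": "local", "cloud_url": "https://example.com/ingest"}

def store_local(reading: dict) -> None:
    """Append one reading to the local buffer database."""
    con = sqlite3.connect("buffer.db")
    con.execute("CREATE TABLE IF NOT EXISTS readings (tag TEXT, ts TEXT, value REAL)")
    con.execute("INSERT INTO readings VALUES (?, ?, ?)",
                (reading["tag"], reading["ts"], reading["value"]))
    con.commit()
    con.close()

def store_cloud(reading: dict) -> None:
    """POST one reading to the hosted endpoint (placeholder URL)."""
    req = urllib.request.Request(
        CONFIG["cloud_url"],
        data=json.dumps(reading).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)

def route(reading: dict) -> None:
    """Send a reading to whichever storage target the site configured."""
    if CONFIG["target"] == "cloud":
        store_cloud(reading)
    else:
        store_local(reading)

route({"tag": "FIC-101.PV", "ts": "2025-09-22T12:00:00Z", "value": 42.7})
```

The same routing decision applies whether the backend is a physical historian, a corporate-cloud VM or a supplier-hosted service; only the storage functions change.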

Stephens adds that, while capabilities have evolved, the biggest recent changes in data analytics are its ease of use and affordability. “The truth is that most industrial data analytics use cases are only moderately complex in terms of the number of variables to analyze and the calculation intensity required, compared to applications in fields from weather modeling to gaming to high-frequency trading,” says Stephens. “However, most industrial companies can’t afford a dedicated team of data scientists, so making data analytics software user-friendly and intuitive is essential. Even if a company chooses to outsource some or all of the initial installation and setup or ongoing support, having a platform that's easily understood and overseen by the operating company is important for ongoing success.”


Deeper details for proper installation

After determining data-processing locations and information types, Hargrove’s Stephens reports that other issues must still be addressed for analytics to be appropriate and effective, such as the performance limits and operating context of each process application.

“We’ve recently done several reliability, availability and maintainability (RAM) studies for clients. These are a little different, in that most people think of data analytics as something you do with live process data. RAM studies are offline studies of process reliability for plants that may or may not be built yet,” explains Stephens. “You can think of these studies as theoretical overall equipment effectiveness (OEE) analyses. Tracking OEE is a great exercise, but if you don't know your theoretical maximums, you may be wasting effort trying to improve OEE with maintenance strategies alone.”
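For readers unfamiliar with the metric, OEE is the product of availability, performance and quality. The short Python sketch below, using made-up numbers, shows the arithmetic and why a RAM-derived theoretical ceiling matters: if the design itself limits availability, maintenance alone can't push OEE past it.

```python
def oee(run_time_h, planned_time_h, actual_rate, ideal_rate, good_units, total_units):
    """OEE = availability x performance x quality."""
    availability = run_time_h / planned_time_h   # uptime vs. planned production time
    performance = actual_rate / ideal_rate       # actual vs. ideal throughput
    quality = good_units / total_units           # first-pass yield
    return availability * performance * quality

# Illustrative numbers only.
measured = oee(152, 168, 90, 100, 9_500, 10_000)   # roughly 0.77
# If a RAM study shows the design itself caps availability at, say, 92%,
# that sets a ceiling no maintenance strategy can exceed.
theoretical_max = 0.92 * 1.0 * 1.0
print(f"measured OEE {measured:.1%}, theoretical ceiling {theoretical_max:.1%}")
```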

For instance, Hargrove recently applied machine vision to a process issue. “Machine vision is often thought of as a tool for manufacturers of parts and other discrete items, or for agricultural processors that need to sort or compare items for quality and consistency,” adds Stephens. “However, there are many applications in the process industries as well, such as monitoring flare stacks, personnel safety or process equipment, using either traditional visual-spectrum cameras, infrared cameras or other specialized, false-color imaging devices.”
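As a loose illustration of the flare-stack case, the sketch below uses OpenCV to flag when the bright (hot) region in a grayscale infrared frame grows unusually large. The threshold and alarm fraction are invented placeholder values, not a validated monitoring method.

```python
import cv2  # pip install opencv-python

# Placeholder values; a real system would calibrate both per camera and site.
BRIGHTNESS_THRESHOLD = 200  # 8-bit intensity treated as "flame" in an IR frame
ALARM_FRACTION = 0.05       # alarm if flame covers more than 5% of the frame

def flare_alarm(frame_path: str) -> bool:
    """Return True if the bright region exceeds the alarm fraction."""
    gray = cv2.imread(frame_path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise FileNotFoundError(frame_path)
    _, mask = cv2.threshold(gray, BRIGHTNESS_THRESHOLD, 255, cv2.THRESH_BINARY)
    flame_fraction = cv2.countNonZero(mask) / mask.size
    return flame_fraction > ALARM_FRACTION

# Usage: flare_alarm("flare_ir_frame.png")
```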

To help users navigate these complex and often thorny issues, Stephens recommends enlisting some competent assistance to identify the most appropriate analytics solutions, and apply them for the greatest positive impact.

“The best advice I can share is to choose a good partner for any project or larger program effort. This may mean working with a vendor directly or with a system integrator,” says Stephens. “Working with a vendor directly may be the simpler approach for straightforward, smaller projects. However, system integrators can assist with vendor selection or tie multiple projects into a cohesive program. They can also act as force multipliers, helping augment company staff to do more, faster, as you roll out a project across a plant or enterprise.”

Stephens concludes that data analytics will get easier going forward because its varied tools will continue to get simpler and more affordable. “I think the most significant changes will be in consumption and visualization, where how we request and view data will be greatly simplified by LLM interfaces that extract and display it,” adds Stephens. “We’ll be able to ask questions about our data verbally, like we see characters do on Star Trek. However, even with all the great things we predict coming in the future, no one should keep waiting to start their data analytics and AI initiatives. Great tools are available now and can help improve your processes and profitability. Continuing to wait puts you further behind your competition.”
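As a sketch of the kind of conversational interface Stephens describes, the example below passes a natural-language question, along with a small sample of recent readings, to OpenAI's chat-completions API. The tag names, values and model choice are assumptions for illustration; nothing here is a specific vendor's recommended pattern.

```python
from openai import OpenAI  # pip install openai; needs OPENAI_API_KEY set

# Hypothetical readings a historian query might return (tag names invented).
recent = {"FIC-101.PV": [98.2, 97.8, 99.1], "TIC-205.PV": [412.0, 415.3, 418.9]}

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model choice
    messages=[
        {"role": "system",
         "content": "Answer questions using only the process data provided."},
        {"role": "user",
         "content": f"Data: {recent}. Is TIC-205 trending up, and by how much?"},
    ],
)
print(response.choices[0].message.content)
```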

About the Author

Jim Montague | Executive Editor

Jim Montague is executive editor of Control.
