
Dutch uncles on data analytics

May 28, 2021
Supplier experts from Emerson, Fluke, GE Digital, Honeywell, Radix, Softing and TrendMiner give users the straight dope on understanding and implementing useful data analytics.

Even though all kinds of software and other tools are readily available, data analytics can still seem like a vast, monolithic and unapproachable topic to many end users. How can all this stuff help my individual process application? Well, there's plenty of good advice out there, too. Here's some of the best offered recently.

Emerson

"Most users have well-established network protocols in the field from the different devices and systems already running there. The challenge now is getting more data from the field to the cloud for analysis," says Logan Woolery, global product manager for Emerson’s Pervasive Sensing business. "This is driving the expansion of edge computing, which is consolidating onsite data, maybe conducting some initial analysis and calculations, and then translating that data to a common cloud language."

Woolery reports that Emerson's IoT Connectivity solution consolidates data and delivers it to cloud-computing services. This is usually done with an industrial PC (IPC) at the edge, such as Emerson's Machine Automation Solutions (MAS) RXi2, which runs next to its gateways and plant-floor components, and uses a cellular router to connect to Microsoft's Azure IoT Hub service. "In the past, a plant might have 20 gateways, each with 50 devices," he says. "This meant users would need to manage 20 data connections to those gateways, handle their middleware and higher-up links, and try not to lose data fidelity or richness during the conversion process.

"This is easier with solutions like IoT Connectivity because middleware is no longer needed and the application can connect directly to the cloud, which allows all 20 gateways to use one IoT Connectivity software broker," explains Woolery. "This is easier to manage because it's more uniform, efficient and secure. It also improves data fidelity and richness because connections aren't losing data across multiple protocol conversions. Many users want to analyze large amounts of data in the cloud, and now they can do it with software that allows them to manage multiple data sources and endpoints through a single-point solution."

Although it's often used as a buzzword, Woolery adds that IIoT consists of three main parts:

  • See begins with sensors and their real-time, time-series data regarding basic operating parameters, energy usage, reliability, and safety. This aspect may begin with monitoring a single asset or process area in the facility, but can be scaled quickly to cover more assets across several areas.

  • Decide is where analytics are performed by programs, such as Emerson's Plantweb digital ecosystem, including Plantweb Insight and Plantweb Optics software. These platforms take in raw process data via a secure path, apply proven algorithms and calculations to make sense of it, and provide users with the information they actually care about most, such as an imminent pump failure or a heat exchanger that's overdue for cleaning.

  • Act is making a decision and planning maintenance or repairs based on the analytics provided to the user.
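
A minimal, rule-based sketch of this see-decide-act loop might look like the following. It is not Plantweb Insight or Optics; the readings, thresholds and work-order stub are hypothetical, and only illustrate how raw data becomes a maintenance action.

```python
# Rule-based sketch of the see / decide / act loop outlined above. This is
# not Plantweb Insight or Optics; the readings, thresholds and work-order
# stub are hypothetical.

def see() -> dict:
    """SEE: collect time-series readings from a monitored asset (stubbed)."""
    return {"pump_vibration_mm_s": 7.8, "bearing_temp_c": 92.0}


def decide(readings: dict) -> list:
    """DECIDE: apply simple analytics to turn raw readings into findings."""
    findings = []
    if readings["pump_vibration_mm_s"] > 7.1:  # illustrative alarm threshold
        findings.append("High vibration: possible imminent pump bearing failure")
    if readings["bearing_temp_c"] > 85.0:      # illustrative alarm threshold
        findings.append("High bearing temperature: inspection overdue")
    return findings


def act(findings: list) -> None:
    """ACT: plan maintenance or repairs based on the analytics (stubbed)."""
    for finding in findings:
        print(f"Create maintenance work order: {finding}")


if __name__ == "__main__":
    act(decide(see()))
```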

"The biggest shift we're seeing due to IIoT, data analytics and using software like Plantweb is illustrated in how the decision-making process is changing," adds Woolery. "Previously, decisions were reactive; equipment would fail and need to be repaired or replaced. Users would try to plan and budget, but it was very difficult. Now, their decisions are more data-driven, and enable plant management and technicians to take a more proactive approach to planning and maintenance. Plus, they can make more holistic decisions, which may consider larger operational issues or more underlying situations."

Fluke

Even though many users want to build algorithm-based analytics vehicles and artificial intelligence (AI) functions that can operate in the cloud, Brian Harrison, head of IIoT strategy at Fluke, cautions that these innovations still require Internet of Things (IoT) devices and networking to get data to the cloud. "Our perspective is that IIoT is still the fuel everything runs on, while edge computing is a growing area of focus for condition-based monitoring and analytics," says Harrison. "There are a number of proven IIoT paths that predate these newer analytics and are by default further along in maturity. For example, our Connect2Assets bridging software is 16 years old and still evolving. We are always looking at new options that can give users data where they want it, whether it's in a data lake or other repository, on a server, or on Amazon Web Services (AWS). We can't do one-size-fits-all analytics because its return on investment (ROI) is different for each process, facility and organization. Some need smarter alerts and alarms, while others need anomaly detection for more comprehensive analytics. Tailoring a journey to the business drivers and goals is critical."

Harrison reports that Fluke's Connected (FC) reliability and integrated IIoT framework consists of sensors and handheld devices, remote monitoring of PLCs and SCADA systems, and a computerized maintenance management system (CMMS) that can hand off asset-tracking data to the cloud or a data lake. However, users must also decide how much access to allow and to whom, overcome resistance to these solutions, and make other cultural changes for them to work and succeed. "The goal is to democratize information to ensure it will be high quality; eliminate garbage-in-garbage-out data; consider the value of information beyond initial readings; and use history and trends to decide where to invest in the future," says Harrison. "This is all focused on getting actionable information to users, so they can analyze it in real time to detect trends and make decisions that add value."
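
The difference Harrison draws between simple alarms and anomaly detection can be illustrated with a generic rolling z-score check, sketched below. This is not Fluke software; the window, threshold and column name are hypothetical, and pandas is assumed.

```python
# Generic illustration of the "smarter alerts"/anomaly-detection idea, not
# Fluke software: flag samples whose rolling z-score shows the signal has
# drifted away from its own recent behavior. Window, threshold and column
# name are hypothetical; pandas is assumed.
import pandas as pd


def rolling_zscore_alerts(series: pd.Series,
                          window: int = 60,
                          threshold: float = 3.0) -> pd.Series:
    """Return a boolean Series marking samples more than `threshold` standard
    deviations from the rolling mean of the trailing `window` samples."""
    mean = series.rolling(window).mean()
    std = series.rolling(window).std()
    z = (series - mean) / std
    return z.abs() > threshold


# Usage with hypothetical once-per-minute vibration data:
# alerts = rolling_zscore_alerts(df["vibration_mm_s"])
# df[alerts]  # samples that warrant an alert or closer review
```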

GE Digital

Mark Hu, director of data and analytics and product development engineering team lead at GE Digital, reports that demand for data analytics is increasing, along with requests for machine learning (ML) and artificial intelligence (AI) tools. However, this is expanding the size and scope of data analytics projects, as well as the capabilities and skills needed to make them happen.

"Data analytics used to be done on a smaller scale, such as one machine with three, five or 10 parameters. Now it's being done on larger scales with many devices that traditional analytics methods can't handle, so more capable hardware and software is needed," says Hu. "Modern analytics also requires support, networking, Internet and cloud storage. And, more information storage means greater data manipulation is possible."

To decide what analytics are needed and whether to perform them locally or in the cloud, Hu reports that users must begin by asking what their business goal is and how large their application is, whether one area of a shop floor or a whole fleet of facilities. "They also need to decide what data they require, what they need from the plant floor, and in what time frame. Information produced every second on the plant floor may only be needed once an hour or once per day in the cloud," explains Hu. "Users must also identify all the constraints they're facing, and use them to decide what analytics tools to employ. These constraints can include data types, top criteria, response speed, and edge issues like whether 24/7 power is needed, or whether more computing capacity is needed to run an AI model. If a fleet needs to be analyzed, users must decide whether to run analytics in a central location or the cloud. It's also important to establish reasonable expectations for what data analytics can achieve based on existing technology, and not over-hype or under-hype a project."
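
Hu's point about cadence, that second-by-second plant-floor data may only be needed hourly in the cloud, is essentially an edge aggregation step. A minimal pandas sketch, with hypothetical tag names and a stand-in upload call, might look like this.

```python
# Sketch of the cadence decision described above: aggregate second-by-second
# plant-floor data at the edge into hourly summaries before it goes to the
# cloud. Tag names and the upload step are hypothetical, and pandas is assumed.
import pandas as pd


def hourly_summary(raw: pd.DataFrame) -> pd.DataFrame:
    """Downsample ~1 s readings (DatetimeIndex) to hourly statistics."""
    return raw.resample("1h").agg(["mean", "min", "max", "std"])


# Usage (hypothetical): `raw` has a DatetimeIndex at roughly one-second
# resolution with columns such as "flow_m3_h" and "temp_c".
# summary = hourly_summary(raw)
# upload_to_cloud(summary)  # stand-in for whatever cloud endpoint is used
```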

Hu adds that GE Digital has implemented several micro, remote-monitoring applications for early failure detection, and built complex models for monitoring that operate in the plant or run non-time-critical applications in the cloud. "This lets customers upload their data, and we send them alerts and reports," he says. "We also run control loops, PLCs and other optimization modules, and monitor PID controllers for real-time responses, so users can adjust quickly when needed. Basic process knowledge and how it applies to my specific use case is critical to analytics that can solve problems, and help steer users away from implementing just any fancy tool.

"Likewise, operator experiences and existing analytics can help users figure out how to embed AI in their decision-making and production workflows. Together, they can determine what production data is needed, design sub-steps for optimization, and identify modules where AI can play a role. GE does inspections for this that include image-recognition, natural-language processing and sensor data. This enables ML and AI functions to run on the backend, so users can build models in the cloud, and deploy them on the edge or in the plant to optimize their workflows."

Honeywell

"The process industries are realizing that cloud-computing tools like Microsoft Azure and Amazon Web Services (AWS) are easier to use, more accessible and more scalable for added use cases. However, their artificial intelligence (AI) and machine learning (ML) capabilities need to be combined with the right domain knowledge," says Anand Vishnubhotla, chief product officer for Connected Enterprise at Honeywell. "Much of the chemistry and physics in process applications produce a variety of data including time-series data (TSD), alarms, alerts, events and transactions. Historians and other devices have solved the problem of storing TSD—as much as 20 years worth at times.

"What isn't solved is combining that data with non-time series data that comes from manufacturing execution systems (MES) and other operations technology (OT) applications that don't save data. This is why it's still challenging to coordinate and contextualize information from these systems with other TSD and alerts. There are tools to look at data from one viewpoint and solve for it, but what users need is the ability to combine data from multiple different applications that address the same unit or operation to get unique insights."

Vishnubhotla reports that coordinating multiple data sources was difficult in the past because legacy distributed control systems (DCSs) and historians took in sensor indications, and handled that data in ways that were difficult to combine. "Different data sources and applications need a layer that sits on top and can understand their relationships, and develop contextual models that show those relationships to users. Again, there are tools that can do this, but how do you also embed domain knowledge?" asks Vishnubhotla. "Honeywell has been working on ways to integrate information, so it can go across multiple applications and synthesize the knowledge buried in those applications. We describe the same temperature data, for example, so it's meaningful for different users, whether they're plant or optimization guys or managers. Our Forge software can automatically collect data using standard protocols, provide edge-to-cloud connectivity using our OPC solutions, and clean the data and contextualize it, too.

"Next, we're working on building data relationships based on domain knowledge, and driving those enhanced datasets to the analytics side. We feel this hybrid approach is important because it lets users take advantage of different techniques they may want to employ. We're also exploring analytics tools for advanced users and data scientists, which also emphasize the user experience. This and the ability to rapidly deploy data analytics is becoming even more essential to production and end users' business strategies."

Radix Engineering and Software

"Lately, there's been a lot of talk about how to convert data into dollars," says Leo Learth, upstream program manager at Radix Engineering and Software. "Some analytics is simple, such as adding first-principle rules to an automation system, but sometimes it's more complex, and needs cloud-computing, ML and AI. To us, data analytics isn't much different than before. It's just there's much more data now, and it's being associated more closely with achieving business goals."

Learth reports that Radix handles three types of data: design and engineering, real-time operations and production, and transactions. "Each can be monetized in different ways depending on where it is in the lifecycle," says Learth. "We focus on predictive maintenance and fine-tuning models. For instance, we recently helped Chevron Brazil by adding software infrastructure and onshore data visibility to its floating production storage and offloading (FPSO) vessel. It operates in the Frade field off the coast of Brazil, which Chevron sold to Petrorio in March 2019."

Radix and OSIsoft software were part of Chevron's overall condition-based maintenance (CBM) project in 2016-18 to digitalize its data for better monitoring and displaying of its emergency valves and subsea critical limits when it operated in the Frade field. This monitoring employed OSIsoft's PI Event Frames software to detect, record and analyze events with well-defined start and end conditions; published monitoring displays with OSIsoft's PI Vision software; and achieved faster configuration and maintenance for the notifications. PI Vision also let Chevron's IT department maintain its digitalized solutions with a governance model, so users could share information and collaborate. (Watch a 2017 video presentation about OSIsoft and Radix's work on Chevron's FPSO project.)
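
Conceptually, an event frame is just an interval where a start condition is met and, later, an end condition is met. The generic sketch below illustrates that idea; it is not OSIsoft's PI Event Frames, and the valve-position example and 5% threshold are hypothetical.

```python
# Generic illustration of detecting events with well-defined start and end
# conditions, in the spirit of the event frames described above. This is not
# OSIsoft PI Event Frames; the valve-position example and 5% threshold are
# hypothetical. pandas is assumed.
import pandas as pd


def detect_events(df: pd.DataFrame, condition: pd.Series) -> list:
    """Return (start_time, end_time) pairs for each contiguous run where
    `condition` is True; df needs a 'timestamp' column aligned with it."""
    events, start = [], None
    for ts, active in zip(df["timestamp"], condition):
        if active and start is None:
            start = ts                   # start condition first met
        elif not active and start is not None:
            events.append((start, ts))   # end condition met
            start = None
    if start is not None:                # event still open at end of data
        events.append((start, df["timestamp"].iloc[-1]))
    return events


# Example (hypothetical): valve-closure events defined as position below 5%:
# events = detect_events(df, df["valve_position_pct"] < 5)
```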

"The project also created digital versions of related equipment, including the entire FPSO topside, separators, vapor recovery unit (VRU), gas compressors, pumps and other devices," says Learth. "Dashboards were added that allow users to see onscreen what's happening on the FPSO. Once this digital foundation was in place, Chevron could build applications to solve business problems at that time, such as monitoring the efficiency of its process for injecting chemicals into its subsea reservoir. The software could determined if injections were good or bad, if tanks were too low, or if it was time to buy more chemicals. It also tested shutdown valves during turnarounds, and even allow observed performance of valves during regular operations to count as tests. All of this was possible because data no longer needed to be in separate silos, but can instead go to cloud-based analytics centers."

Learth adds that preparing data for analysis used to involve manually seeking outliers, miscalibrated instruments, and flat data values from halted recordings or communication problems. Now, the solution is applying basic data quality rules and a context layer, which can be done on premises or in the cloud. "Next, analytics are directed by whatever business problem the user wants to solve and how they plan to measure it. This shows if they can use existing, in-house software, or if they need to add new models or an architecture that can address their problem," says Learth. "The initial problem also directs the modeling phase. It shows if a simple or complex model is required; whether the cloud is needed for complex statistical calculations; and if AI, neural network, image recognition or other functions would be useful. A complex model isn't needed if a simple one will achieve the business goal."
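
The basic data-quality rules Learth mentions, such as catching flat-lined signals from halted recordings and gross outliers, can be expressed in a few lines. The sketch below is a generic example with hypothetical thresholds and column names, not a Radix deliverable.

```python
# Sketch of the basic data-quality rules mentioned above: flag flat-lined
# stretches (for example, a halted recording) and gross outliers before any
# modeling. Thresholds and column names are hypothetical, not a Radix product.
import pandas as pd


def quality_flags(series: pd.Series,
                  flat_window: int = 30,
                  sigma: float = 4.0) -> pd.DataFrame:
    """Per-sample flags: 'flat' if the value hasn't changed over the trailing
    `flat_window` samples, 'outlier' if it sits far outside the overall spread."""
    flat = series.rolling(flat_window).std() == 0
    z = (series - series.mean()) / series.std()
    return pd.DataFrame({"flat": flat, "outlier": z.abs() > sigma})


# Usage (hypothetical):
# flags = quality_flags(df["flow_m3_h"])
# clean = df[~(flags["flat"] | flags["outlier"])]  # keep samples passing both rules
```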

Softing

Deane Horn, director of marketing operations at Softing, reports that its tManager module is the only appliance that plugs directly into the backplanes of Rockwell Automation’s ControlLogix PLCs, pulls native data from them, lets users select and map what they want, and cleans and prepares information for use by other users and applications. It also has connections for sending data to SQL Server, other databases or cloud-computing services.

“Early users looked to data capture for tracking and tracing, such as automotive manufacturers that were at the forefront, and used it to prevent data loss and downtime,” says Horn. “We did hundreds of installations like this, and other industries like pharmaceuticals and food and beverage began to recognize this, and started using tManager for overall equipment effectiveness (OEE) and improving resource utilization. We don’t do analytics, but all analytics tools need timely and correct information. It’s a big challenge to get data out easily and securely, and that’s what we do.”
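
For context on the OEE use case Horn mentions, OEE is conventionally the product of availability, performance and quality. A minimal sketch with hypothetical inputs:

```python
# Minimal OEE sketch for the use case mentioned above: OEE is the product of
# availability, performance and quality. Inputs are hypothetical and not tied
# to any Softing or Rockwell product.

def oee(planned_min: float, downtime_min: float,
        ideal_cycle_s: float, total_count: int, good_count: int) -> float:
    run_min = planned_min - downtime_min
    availability = run_min / planned_min                          # uptime share of planned time
    performance = (ideal_cycle_s * total_count) / (run_min * 60)  # actual vs. ideal rate
    quality = good_count / total_count                            # good-part ratio
    return availability * performance * quality


# Example: 480 planned minutes, 47 minutes down, 1.2 s ideal cycle time,
# 19,000 parts made, 18,400 good:
# oee(480, 47, 1.2, 19000, 18400)  # roughly 0.90 * 0.88 * 0.97 ≈ 0.77
```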

Jim Ralston, channel sales director for industrial automation and end-user solutions at Softing, adds, “Our approach is at the controller level and getting data from controllers, regardless of what actuator or server the user has. We also have drivers that can talk to other types of controllers, and get data from them. Our easily configurable module serves as a native card in its rack, so it’s seen as just another controller. Users can then choose the conditions for receiving their data based on their individual processes, such as what data types are pertinent and will support their subsequent analyses. We ask the controller for all its tags in Studio 5000 software, and then users pick the packets, conditions and times they want.”

Horn adds, "One really big point is that it’s commonly assumed that if you want to connect a PLC to the enterprise, you're going to need some software coding, protocol translation and, of course, a PC to handle this connection. So, the thing that stands out to users of tManager is the contrast between a PLC in-chassis solution and an external PC-based solution. The PLC in-chassis eliminates software coding, protocol translation and even the PC."
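
For contrast with the in-chassis approach Horn describes, the external PC-based path typically involves exactly the kind of coding he says tManager eliminates. The sketch below shows what that might look like using the open-source pycomm3 and pyodbc libraries; the tag names, PLC address and database connection string are hypothetical, and none of this code is a Softing product.

```python
# Sketch of the external, PC-based path Horn contrasts with the in-chassis
# module: a PC polls a ControlLogix PLC and writes selected tags to SQL Server.
# Tag names, PLC address and connection string are hypothetical, and the
# open-source pycomm3 and pyodbc libraries shown here are not Softing products.
import pyodbc
from pycomm3 import LogixDriver

TAGS = ["Line1_PartCount", "Line1_Temperature", "Line1_Faulted"]  # hypothetical tags
SQL_CONN = ("DRIVER={ODBC Driver 17 for SQL Server};"
            "SERVER=db01;DATABASE=plant;Trusted_Connection=yes")


def capture_once(plc_ip: str = "192.168.1.10") -> None:
    with LogixDriver(plc_ip) as plc, pyodbc.connect(SQL_CONN) as db:
        results = plc.read(*TAGS)          # read the selected tags in one request
        cursor = db.cursor()
        for tag in results:
            cursor.execute(
                "INSERT INTO plc_data (tag_name, tag_value) VALUES (?, ?)",
                tag.tag, str(tag.value),
            )
        db.commit()
```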

To help cloud-computing services reach PLC-based data on the plant floor, Horn reports that Amazon recently contacted Softing, and they’re collaborating to develop a quick-start process for data access and dashboard development. “One user told us that, instead of weighing trucks, manually entering data, and searching for outliers in stacks of paper, they use tManager to scan the truck and driver’s badge, which automatically identifies any outliers," says Horn. "This kind of digital transformation changes entire work processes.”

TrendMiner

Just as time-stamped data is coming from wider-ranging devices for analytics that can give users more complete views of their processes, business intelligence platforms like Microsoft Power BI and Tableau are also moving beyond their usual record-keeping tasks to examine collections of time-series data (TSD) for the same reasons. "On the other hand, self-service, time-series, advanced-analytics tools enable subject matter experts (SME) to analyze data, before creating dashboards as production cockpits. These tools are tying in laboratory information management systems (LIMS), enterprise resource planning (ERP) and computerized maintenance management systems (CMMS) for more holistic views of production process performance," says Nick Van Damme, product director at TrendMiner. "However, the pandemic left even more users needing access to their data from everywhere. This is because fewer people can be at their facilities, but also because they want self-service analytics that allows operators to do what data scientists can do."

Van Damme reports increasing data volumes and stronger algorithms are letting users investigate more varied production events, and create cross-pollinated business applications and production cockpits. "The actionable dashboards for a range of pumps can indicate if they're running OK or even if their energy consumption is acceptable, but now they can also show if their software was patched by using added links and consistent data," says Van Damme. "The best way to clean data is not having to do it. Many data acquisition (DAQ) platforms like historians do TSD quality checks, but if they aren't available, there are options in TrendMiner for validating information and checking if it's a truthful representation. Instead of just trusting readings from hardware, users can add software-based, virtual sensors that build laws-of-nature relationships and confirm logical correlations in data being received."
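
The "laws-of-nature" validation Van Damme describes can be approximated by computing an expected value from related measurements and flagging readings that break the physical relationship. The sketch below uses the square-root relationship between orifice pressure drop and flow as an example; it is not TrendMiner's implementation, and the coefficient and tolerance are hypothetical.

```python
# Generic sketch of the software-based "virtual sensor" validation described
# above, not TrendMiner's implementation: check that a measured flow agrees
# with the flow implied by the pressure drop across an orifice (flow varies
# with the square root of dP), and flag samples that break that relationship.
# The coefficient k and the tolerance are hypothetical.
import numpy as np


def validate_flow(measured_flow: np.ndarray, dp_kpa: np.ndarray,
                  k: float = 3.2, tolerance: float = 0.15) -> np.ndarray:
    """Boolean mask of samples whose measured flow deviates from the
    virtual-sensor estimate k * sqrt(dP) by more than `tolerance` (fractional)."""
    expected = k * np.sqrt(np.clip(dp_kpa, 0, None))
    deviation = np.abs(measured_flow - expected) / np.maximum(expected, 1e-9)
    return deviation > tolerance


# Samples flagged True deserve a closer look: the flow meter, the dP
# transmitter, or the assumed relationship itself may be off.
```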

Van Damme adds that TrendMiner's software isn’t intended to check for initial data consistency, but instead taps into enterprise historians, reads data that's already been prepared, analyzes it in near real-time, performs pattern recognition, and enables more data-driven decisions. "This allows users to analyze large volumes of data without using a traditional advanced process control (APC) model," says Van Damme. "There's been a big uptick in IIoT and other platforms using microprocessors and Ethernet; even sensors can access and store data after they collect it. TrendMiner can work on top of these IIoT platforms and data lakes, using their collected data. On top of that, TrendMiner's ContextHub software is designed to use information from all additional data sources in the cloud or on premises, such as LIMSs, ERPs and CMMSs. For example, batch quality from a LIMS can be added to the trend data to find the best-performing batches and create new contexts for future reference."
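
The batch-quality example Van Damme gives amounts to summarizing trend data per batch and joining the LIMS result so the best-performing batches stand out. A hedged sketch with hypothetical column names (not ContextHub itself):

```python
# Sketch of the LIMS/batch-context example above, not ContextHub itself:
# summarize trend data per batch, join the LIMS quality result, and rank the
# batches so the best performers stand out. Column names are hypothetical.
import pandas as pd


def rank_batches(trend: pd.DataFrame, lims: pd.DataFrame) -> pd.DataFrame:
    """trend: [timestamp, batch_id, temp_c, pressure_kpa];
    lims: [batch_id, quality_score]."""
    per_batch = trend.groupby("batch_id")[["temp_c", "pressure_kpa"]].agg(["mean", "std"])
    per_batch.columns = ["_".join(col) for col in per_batch.columns]
    ranked = per_batch.join(lims.set_index("batch_id")["quality_score"])
    return ranked.sort_values("quality_score", ascending=False)


# The top rows show the operating-condition signatures of the best batches,
# which can serve as a "golden batch" reference for future runs.
```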

About the Author

Jim Montague | Executive Editor

Jim Montague is executive editor of Control. 
