Two of the most frequently mentioned technologies related to data analytics are artificial intelligence (AI) and machine learning (ML). However, though both have the potential to aid data analytics in process applications, the jury is still out on how much AI and ML are actually delivering so far.
"We hear a lot about AI and ML, so much so that the joke is whenever data analytics is deployed, everyone is calling what they have 'machine learning' because everyone else is wowed by it," says Will Aja, customer operations VP at Panacea Technologies Inc., a CSIA-member system integrator in Montgomeryville, Pa. "However, they're all creating and doing the same thing because at their core, AI and ML are just applying algorithms to a data set, and many people are creating code to show dashboards of data and calling it AI or ML. The problem is many users are buying 'ML solutions,' but finding out what they actually needed wasn't so simple because their data needed to be cleaner and better aggregated."
To get beyond buzzwords and achieve useful solutions, Aja reports that good data analytics must employ a program that's prepared and established ahead of time. "If you have 1,000 devices to get data from, you need to begin with a good model and a central data storage area. It won't help to just throw a data analytics system on top," he explains. "Likewise, if you want temperature measurements from two mixing tanks, but they've been using different ways to set up reporting, they need a uniform way to collect their data, take it from their historian for analysis, integrate outside data sources, and make better decisions with less human involvement."
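Aja's two-tank example can be sketched in code. The snippet below is a minimal, hypothetical illustration of pulling each tank's readings into one uniform schema before analysis; the tag names, field layouts, and units are invented for the example, not taken from any real historian.

```python
# Minimal sketch of normalizing temperature readings from two tanks whose
# historians report data in different layouts into one uniform record.
# All tag names, field names, and units here are hypothetical.

# Raw samples as each tank's (imaginary) historian reports them
tank_a_sample = {"tag": "MIX_TANK_A/TT101.PV", "value": 72.5, "unit": "degF"}
tank_b_sample = {"TagName": "TankB.Temp", "Val": 23.1, "Units": "degC"}

def to_uniform(sample):
    """Map a historian-specific sample to one shared schema, in Celsius."""
    if "tag" in sample:  # Tank A's layout: lowercase keys, Fahrenheit
        value = sample["value"]
        if sample["unit"] == "degF":
            value = (value - 32) * 5 / 9
        return {"asset": "tank_a", "measure": "temperature_c",
                "value": round(value, 2)}
    else:                # Tank B's layout: capitalized keys, Celsius
        return {"asset": "tank_b", "measure": "temperature_c",
                "value": sample["Val"]}

records = [to_uniform(s) for s in (tank_a_sample, tank_b_sample)]
print(records)
```

Once both tanks land in the same schema, downstream analysis and reporting can treat them identically, which is the point of Aja's "uniform way to collect their data."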
Aja states that a new data collection and analysis plan requires users to decide not only what data to collect, but what it will look like, what tag structure to use, how it will flow, who will see it, and what reports to generate. "It's bad strategy to collect all data without a plan because it will become noise. An example would be collecting data on an unchanging temperature every second without justification for why or how the data will be used," Aja says. "If a user has an existing application infrastructure, it's not too late to start a data analytics plan. However, because it's very difficult to tackle a plantwide or global analytics program, a feasibility study can determine what data users already have, what else they want to collect and how to adjust their infrastructure to get the data they want."
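The unchanging-temperature example points at a common remedy: deadband (report-by-exception) logging, which stores a sample only when it changes meaningfully instead of every second regardless. The sketch below assumes a hypothetical 0.5-degree deadband and a made-up sample stream.

```python
# Minimal sketch of deadband (report-by-exception) logging: keep a sample
# only when it differs from the last stored value by more than the deadband.
# The 0.5-degree deadband and the sample stream are hypothetical.

def deadband_filter(samples, deadband=0.5):
    """Yield only (time, value) pairs that move past the deadband."""
    last = None
    for t, value in samples:
        if last is None or abs(value - last) > deadband:
            last = value
            yield (t, value)

# One reading per second from a nearly unchanging temperature
stream = [(0, 70.0), (1, 70.1), (2, 70.1), (3, 70.9), (4, 70.9), (5, 72.0)]
stored = list(deadband_filter(stream))
print(stored)  # only the meaningful changes survive
```

The trade-off is choosing the deadband: too tight and the noise Aja warns about comes back; too loose and real process movement is lost, which is why the plan has to say how each measurement will actually be used.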
Aja adds that it also helps to bring all available users into a data analysis project, and implement a pilot analysis program in a small operational area. This pilot can be a four- to six-week study of an application, which evaluates existing operations, network architecture, needed data and how to collect it, and other requirements that may be identified. "It's important to talk to people about what they want from data analytics, what reports they need, what they hope to accomplish, and what they'd have if their plant was new," Aja says. "If they can identify their current and desired states, and how to get there, it will help determine what the pilot study should include. Small steps are how you build a larger picture. You can't just jump into a site-wide data analytics deployment. Even if you tried it, you'd just end up with more of the same, old reports.
"In addition, it's easier to get the processing power that data analytics and optimization projects require by using cloud-based services like Amazon Web Services (AWS) because they're easier to build and maintain than trying to do it on your own. In the future, the big connection and gains come when data analytics goes full circle, users start to benefit from contextualized information, and their systems begin to change on their own. This is where AI will begin to go beyond the usual setpoints and alarms, and get into real-time control. However, it still will be based on someone creating the algorithms that apply to their situation, and their job will go from running their plant to truly optimizing it because they'll be free to see what they can do differently."
Yota Furukawa, head of the New Field Development Center at Yokogawa Electric Corp., adds: "Many data analytics methods and tools were developed for use in the process industries to solve various problems, but there are several problems that classic tools and methods can't solve. As ML and AI technologies were developed in the IT industry, they were expected to address these issues, but our observations show they can't simply be applied to process applications. However, approaches that combine ML, AI, plant domain knowledge and first-principles modeling have gradually become more sophisticated, and can now solve some of these difficult problems. We think this combination has triggered an evolution."
Furukawa reports that Yokogawa provides solutions for anomaly detection in systems of assets such as motors and compressors, optimization of plant processes, and root cause analysis of past failures. They're implemented using a portfolio of data analysis methods leveraging Yokogawa's proprietary domain knowledge. One example, using ML/AI for anomaly-sign detection on a compressor, is described in Yokogawa Electric Report, Vol. 60, No. 1, pp. 35-38, in the section titled "Compressor System State Diagnosis and Failure Sign Detection."
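To make the idea of anomaly-sign detection concrete, here is a deliberately simplified, generic illustration using a rolling z-score over a baseline window. This is not Yokogawa's proprietary method, and the vibration readings are made up for the example.

```python
# Generic, simplified anomaly-sign detection: flag a reading that deviates
# more than `threshold` standard deviations from the mean of the preceding
# window. NOT Yokogawa's method; the data below is invented.
import statistics

def anomaly_signs(readings, window=5, threshold=3.0):
    """Return indices of readings that break sharply from recent behavior."""
    flagged = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mean = statistics.mean(baseline)
        stdev = statistics.stdev(baseline)
        if stdev > 0 and abs(readings[i] - mean) / stdev > threshold:
            flagged.append(i)
    return flagged

# Hypothetical compressor vibration amplitudes; a jump appears at the end
vibration = [1.0, 1.1, 0.9, 1.0, 1.1, 1.0, 0.9, 1.1, 4.0]
print(anomaly_signs(vibration))
```

A statistical trigger like this only says "something changed"; as Furukawa notes below, turning that into a useful diagnosis still depends on plant domain knowledge about what the asset's normal behavior should be.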
"We think no ML/AI or other methods can solve customers’ problems well unless they are applied using domain knowledge," Furukawa adds. "Instead, ML/AI should be used to leverage human domain knowledge for problem solving. We think the effective integration of domain knowledge with the latest technologies is the most important concept or approach to data analysis."