If adding data analytics tools is the first and most crucial task, training users to employ them comes in a close second.
“When I was hired almost 10 years ago, we started adding networking for data systems, and when we were able, we also added Internet-based remote access to each machine that Arthur G. Russell Co. builds. However, many customers weren’t ready to use these capabilities. They told us they didn’t have the infrastructure for the high level of data, and due to the skills gap they were experiencing, they asked us to take on a larger aspect of machine support, so we had to take a step back,” says Brian Romano, technology development director at AGR in Bristol, Conn., a machine builder and member of the Control System Integrators Association (CSIA). “Machine builders and operators have expertise with servomotors, PLCs and vision systems, but accessing and analyzing data requires a broader knowledge base and skill set, including networking expertise and other skills related to OT and IT.”
More sensors, better networking
Romano explains that AGR is leveraging several Industry 4.0 practices and principles, and adding Industrial Internet of Things (IIoT) capabilities to its equipment to expose machine data, so it can improve machine performance and provide better remote support. In general, this involves adding machine-health sensors that monitor parameters such as vibration and current draw, not just the usual key process input variables (KPIV). These devices are combined with added networking, which enables data to be exposed, collected, analyzed and applied for added learning. For instance, AGR uses point-to-point IO-Link capabilities to reach components like photoelectric eyes or proximity sensors and expose internal device information, such as temperatures, signal quality and other health characteristics. Next, this new data is sent via an Ethernet interface back to the usual PLC network, where it can contribute to monitoring the overall health of the machine and its operations.
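In code, pairing the usual KPIVs with this auxiliary device-health data could look like the following minimal sketch. The parameter names and the reader callbacks are hypothetical stand-ins for actual PLC and IO-Link reads, not AGR's implementation:

```python
from dataclasses import dataclass
import time

@dataclass
class MachineSample:
    """One snapshot combining process KPIVs with device-health data."""
    timestamp: float
    kpiv: dict    # key process input variables, e.g. cycle time, current draw
    health: dict  # auxiliary device internals, e.g. photo-eye temperature

def read_sample(read_kpiv, read_health) -> MachineSample:
    # read_kpiv / read_health stand in for the PLC and IO-Link interfaces
    return MachineSample(time.time(), read_kpiv(), read_health())

# Illustrative values only
sample = read_sample(
    lambda: {"cycle_time_s": 1.92, "current_a": 3.4},
    lambda: {"photo_eye_temp_c": 41.5, "signal_quality_pct": 97},
)
```

Keeping both kinds of data in one timestamped record is what lets later analysis correlate process behavior with device health.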
“In the past, PLCs just handled process information, but adding formerly ancillary data that used to be off to the side can be very useful. For instance, if a photo-eye is located near a bearing on a machine—besides the usual time series data (TSD) on cycle times, current draw, temperatures and other parameters—we can now look at all the data surrounding a failure, including the temperature from the photo-eye itself to help indicate and prevent future failures,” says Romano. “This can give us signatures and benchmarking for the future. For example, if we find that temperatures, current draw and cycle times are similar to what they were when a previous failure happened, we can use the repeat of these characteristics to adjust, fix or replace components before they fail. This can also help users bring down a process gracefully, and minimize its impact as planned downtime, rather than dealing with an unexpected failure and the chaos of unplanned downtime.”
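As a sketch of the signature idea Romano describes, the comparison can be as simple as checking whether today's readings fall within a tolerance band of the values recorded around a prior failure. The parameter names, values and tolerance below are illustrative assumptions, not AGR's actual data:

```python
def matches_signature(current: dict, signature: dict, tolerance: float = 0.10) -> bool:
    """Return True when every monitored parameter is within `tolerance`
    (as a fraction) of the value recorded around a previous failure."""
    return all(
        abs(current[k] - v) <= tolerance * abs(v)
        for k, v in signature.items()
    )

# Benchmark captured around a past bearing failure (made-up numbers)
failure_signature = {"photo_eye_temp_c": 55.0, "current_a": 4.2, "cycle_time_s": 2.3}

now = {"photo_eye_temp_c": 54.0, "current_a": 4.3, "cycle_time_s": 2.25}
if matches_signature(now, failure_signature):
    print("warning: conditions resemble prior failure; schedule planned downtime")
```

A match does not prove a failure is imminent; it is a trigger to bring the process down gracefully as planned downtime rather than wait for the unplanned kind.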
To help its clients gain the IIoT, networking and analytics skills they need to better sustain operations, Romano reports AGR recently launched a multi-level remote support program. This service takes a client’s collected information such as regular KPIVs and auxiliary data indicating machine health, combines it with embedded information built into their control infrastructure, and analyzes it—onsite if necessary—with software such as Sorba.ai or Aveva Edge to identify trends and anomalies. The program also schedules remote support service hours with AGR’s engineers, and provides clients with Epson’s Moverio glasses that have a video camera and live feed, so onsite and remote users can each see the application and equipment they’re working on, as well as pull up manuals and other support materials.
Insights via eavesdropping
Before implementing a data analytics project, Romano adds that users and system integrators should carefully examine their existing and legacy equipment and processes. A thorough survey of all their sensors, networks and other devices is essential in guiding users toward what they’ll need for analytics and other future initiatives.
“We mostly make medical device assembly equipment, so each station has sensors for managing filling and dispensing, bulk-tank sensing, reservoirs and other processes and operations,” says Romano. “In our case, we need to look deeper into data from these and other functions. While existing sensors can help run basic production applications, more sensors are often needed to get a wider-scope view of our machines and overall processes, so we can address bottlenecks and other issues. This is a real paradigm shift that requires thinking differently and addressing some unusual categories. If a PLC in charge of a control loop shows an unexpected temperature increase, we have to ask what else could that loop and any surrounding sensor information tell us about product quality or equipment health?”
Once existing information capabilities have been audited and upcoming needs determined, users can plan and build their new analytics infrastructure. In addition to added sensors and networking, this includes a data concentrator, which is available in many forms: historians, embedded PCs, PLCs with expansion cards and, increasingly, cloud-computing services.
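Whatever form it takes, a data concentrator's basic job is the same: gather records from multiple machine sources and hand them off in batches to a sink such as a historian or cloud endpoint. A minimal sketch, with hypothetical source names and an in-memory sink standing in for real storage:

```python
import queue

class DataConcentrator:
    """Collects records from multiple machine sources and flushes them in
    batches to a sink (historian, embedded PC, or cloud endpoint)."""
    def __init__(self, sink, batch_size=10):
        self.q = queue.Queue()
        self.sink = sink
        self.batch_size = batch_size

    def ingest(self, source: str, record: dict):
        self.q.put({"source": source, **record})
        if self.q.qsize() >= self.batch_size:
            self.flush()

    def flush(self):
        batch = []
        while not self.q.empty():
            batch.append(self.q.get())
        if batch:
            self.sink(batch)  # in practice: write to historian or cloud

# Illustrative use: a plain list stands in for the downstream store
stored = []
dc = DataConcentrator(stored.extend, batch_size=2)
dc.ingest("press_1", {"temp_c": 41.0})
dc.ingest("press_2", {"temp_c": 43.5})
```

Batching is the design point here: it decouples fast machine-side sampling from slower historian or cloud writes.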
“It’s also important not to forget legacy devices from before the 2000s and even as far back as the 1960s. Many may still have crucial data that can be exposed using data acquisition (DAQ) units, PLCs or PCs, even if it’s difficult to reach,” says Romano. “Interface equipment, such as Banner’s Snap Signal line, provides ‘T’ connectors that can be inserted into existing systems to expose information from equipment that can’t be touched otherwise. They can be used to access data from 1970s-era devices without affecting the machine’s other operations, and then relay that information for analysis.”
Simple to sophisticated
Romano adds that data analytics can be rudimentary, such as determining overall equipment effectiveness (OEE) of machines or processes, or more in-depth by using vibration sensors, load cells or current draw to predict when bearings and motors are likely to fail. However, the role of analytics doesn’t have to stop there.
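At the rudimentary end of that range, OEE is simply the product of availability, performance and quality. A minimal calculation, with illustrative shift numbers rather than data from the article, looks like:

```python
def oee(planned_time_min, run_time_min, ideal_cycle_s, total_count, good_count):
    """Standard OEE = availability x performance x quality."""
    availability = run_time_min / planned_time_min
    performance = (ideal_cycle_s * total_count) / (run_time_min * 60)
    quality = good_count / total_count
    return availability * performance * quality

# Example shift: 480 min planned, 400 min actually running,
# 1.5 s ideal cycle, 14,000 parts produced, 13,600 good
score = oee(480, 400, 1.5, 14000, 13600)
print(round(score, 3))  # prints 0.708
```

Even this simple number is useful as a baseline: the more in-depth predictive work then explains why availability or quality dipped.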
“If a bearing fails at 9 a.m., users can ask what was happening and what the data showed in surrounding components and systems. Then, when those overall conditions are seen again in the future, users may be able to react even sooner,” he says. “Plus, this is an iterative model that can keep improving by gathering more parameters, adding new data, adjusting itself, and applying new learning. For industrial process and automation systems, this is all that artificial intelligence (AI) really is—a data classifying, modeling, and forecasting system.”
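One common way to realize the iterative, self-adjusting model Romano describes is a baseline that updates with every new sample and flags readings outside the learned band. The sketch below uses Welford's incremental mean/variance algorithm with made-up temperature readings; it is one possible technique, not AGR's specific method:

```python
class RunningStats:
    """Incrementally updated mean/variance (Welford's algorithm), so the
    baseline keeps adjusting as each new parameter reading arrives."""
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def update(self, x: float):
        self.n += 1
        d = x - self.mean
        self.mean += d / self.n
        self.m2 += d * (x - self.mean)

    def is_anomalous(self, x: float, z: float = 3.0) -> bool:
        if self.n < 2:
            return False  # not enough history to judge
        std = (self.m2 / (self.n - 1)) ** 0.5
        return std > 0 and abs(x - self.mean) > z * std

# Learn a baseline from normal readings (illustrative values)
temps = RunningStats()
for t in [40.1, 40.3, 39.9, 40.2, 40.0, 40.4]:
    temps.update(t)

print(temps.is_anomalous(55.0))  # prints True: far outside the learned band
```

Classifying readings against a model and updating that model as data accumulates is exactly the classify-model-forecast loop described in the quote.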