
Familiarity with IIoT clarifies onsite vs. cloud debates

July 18, 2023
IIoT mini-series—Day 10—System integrator Hargrove Controls & Automation shows how to resolve questions about virtualization and simulation, as well as on-premises versus cloud computing

Once new technologies like the Industrial Internet of Things (IIoT) become more familiar to users, they go from being add-on novelties to designed-in cornerstones, which also makes them simpler to implement and quicker to deliver benefits.

“Virtualization has been the default architecture for most operator HMI systems for quite a few years, with only the most safety-critical stations deploying thick clients. Within the last few years, there’s been an increased focus on designing operations technology (OT) infrastructures to account for smart, IIoT edge devices and cloud computing,” says Alan Polk, OT and cybersecurity leader at Hargrove Controls & Automation, a CSIA-certified system integrator in Mobile, Ala. “Many companies are starting to increase their focus on the OT network during front end engineering to ensure that required pathways for data to flow from the cloud to the edge devices are built into the process from the foundation with security and reliability in mind.”

Heath Stephens, digitalization leader at Hargrove, reports, “I think the cost advantages of virtualization are well recognized at this point. The computing needs for operator and engineering stations for a standard PLC, DCS or SCADA deployment don’t tax a standard Microsoft/Intel box. Virtualization allows companies to manage multiple, virtual PCs more easily with a much smaller hardware cost and improved resilience.”

Some obvious advantages

“IIoT allows us to bring in process data that may not have been cost-justifiable for a more traditional PLC/DCS platform using 4-20 mA wiring or I/O bus technology,” Stephens continues. “In addition, digitalization lets us create digital twin simulations that give us better insight into how our processes are performing, and better predict what may happen next. It also improves our ability to share data between areas of operation that were more siloed in the past, like maintenance and resource planning.”

Polk agrees that IIoT’s ease-of-use subsequently enables analytics and simulations that weren’t possible before. “Industrial processes are incredibly complex, and optimization on a real-time basis is beyond the capabilities of even the most advanced PLC or DCS system,” explains Polk. “The process industry was aware of this decades ago, and shifted towards complex, model-based systems for advanced process control (APC), which sat above the PLC/DCS level and provided tuning adjustments. With the recent capabilities of machine learning (ML) to analyze models with input and output matrices tens to hundreds of times larger than traditional APC platforms can support, the opportunities for cost reduction, yield increases and quality optimizations are almost endless. Unfortunately, the cost of deploying and maintaining servers that can handle these types of large analyses pushes return on investment (ROI) in the wrong direction, making it harder for companies to justify the investment. Cloud computing helps to move the ROI back in an affordable direction by allowing a central large analysis system to service multiple facilities.”

Advocating for change

While IIoT has the potential to make virtualization and other applications easier, Polk and Stephens caution that users must focus on their own requirements to filter all the emerging suppliers and products—and also sell reluctant colleagues on adopting them.

“There’s been a bit of a gold rush into the Industry 4.0 space by a lot of companies, many with great ideas and products. However, that has also brought about a lot of competing technologies and communication protocols,” says Stephens. “There’s also still a large installed base of existing control system platforms, using older technologies that we often try to integrate alongside newer IIoT devices.”

Polk adds, “Very few facilities would benefit from less data collection and analysis. There’s almost always room for improvement. Companies need to evaluate their appetite for pursuing optimization, as well as their willingness to accept change and take on risks. A large contingent of the industrial sector is very risk-averse, and on the operating floor, change is usually not welcomed with open arms. Plant operators and supervisors have been operating facilities in a similar manner for decades. If the current status quo provides a safe and reliable atmosphere, change is a hard sell. Companies must be willing to push change down to the plant floor to take advantage of the power that IIoT-based edge devices and cloud computing can give them.”

Onsite or in the cloud?

To figure out how much computing to do on the edge and how much to send to the cloud, Stephens reports that users can employ a probability/severity matrix (similar to deciding when to use an independent protection layer) to determine if a function needs to remain on premises or if it can move to the cloud.

“In almost all cases, any calculation associated with the safety of the facility will need to be handled on premises, while setpoint changes that optimize yield and cost can be moved to the cloud if there’s a computing-horsepower advantage,” says Stephens. “On-premises and edge computing typically provide lower latency, more direct connectivity and higher reliability. However, you may have to sacrifice computing power, storage capacity and capability. This is where the cloud can provide benefits. It’s also important to remember that, no matter how advanced and powerful cloud and other Industry 4.0 technologies may be, they depend on reliable data to be successful. This often comes down to the basics, such as well-engineered and maintained instrumentation, reduction of manual data entries and errors, and reliable system connectivity.”
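As an illustration of that screening approach, a minimal sketch of a probability/severity placement check might look like the following. The 1-5 scales, risk thresholds and category labels are hypothetical assumptions for this example, not Hargrove's actual methodology:

```python
# Hypothetical probability/severity screen for deciding whether a computing
# function can move to the cloud or must stay on premises. The scales,
# thresholds and labels below are illustrative assumptions only.

def placement(probability: int, severity: int, safety_related: bool) -> str:
    """Score a function on assumed 1-5 probability and severity scales,
    reflecting the likelihood and consequence of a cloud or connectivity
    outage affecting that function."""
    if safety_related:
        # Per Stephens: safety calculations stay on premises, full stop.
        return "on-premises"
    risk = probability * severity      # simple matrix score
    if risk >= 15:                     # high risk: keep it local
        return "on-premises/edge"
    if risk >= 8:                      # medium risk: hybrid with local fallback
        return "cloud with on-premises fallback"
    return "cloud"                     # low risk: latency/outage is tolerable

# Example: a yield-optimizing setpoint calculation with modest outage impact
print(placement(probability=2, severity=3, safety_related=False))  # -> "cloud"
```

In practice, the thresholds would reflect each site's risk tolerance, much like the criteria used when deciding whether an independent protection layer is warranted.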

Cloud concerns and solutions

Stephens confirms that Amazon Web Services (AWS), Microsoft Azure and other cloud services are emerging in plant-floor applications that combine edge and cloud functions, which is stoking worries about cybersecurity, too.

“While still used in a minority of processes today, successful use cases by AWS, Azure and others are building momentum. The two major categories are where users are using ‘raw’ AWS and Azure services for computing and storage, and where users utilize cloud-based software as a service hosted on a major cloud provider’s platform,” explains Stephens. “The primary fear clients voice to me is cybersecurity. While certainly an area to address, cloud platforms are generally as secure as, or more secure than, clients’ on-premises installations. The more important risks to address are often related to latency, cost, and what to do in case of a system/connectivity outage.”

To further optimize virtualized, cloud-based, IIoT-enabled and mobile applications, Stephens reports that cloud services typically offer built-in and third-party tools for managing monthly costs, which are usually based on CPU and storage usage.
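For illustration only, a back-of-the-envelope estimate of that usage-based billing might look like the sketch below. The rates are made-up placeholders, not any provider's actual pricing:

```python
# Illustrative monthly cloud cost estimate driven by CPU and storage usage.
# Both rates are hypothetical placeholders, not real provider pricing.

CPU_RATE_PER_VCPU_HOUR = 0.05     # USD per vCPU-hour (assumed)
STORAGE_RATE_PER_GB_MONTH = 0.02  # USD per GB-month (assumed)

def monthly_cost(vcpus: int, hours: float, storage_gb: float) -> float:
    """Return an estimated monthly bill for compute plus storage."""
    return (vcpus * hours * CPU_RATE_PER_VCPU_HOUR
            + storage_gb * STORAGE_RATE_PER_GB_MONTH)

# Example: 8 vCPUs running all month (~730 h) plus 500 GB of historian data
print(f"${monthly_cost(8, 730, 500):,.2f}")  # -> $302.00
```

Comparing estimates like this against actual invoices is one way such tools flag workloads that would be cheaper to keep on the edge.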

“Clients also need to adopt tools that allow them to easily monitor the health of these systems,” adds Stephens. “For example, virtualization is well-supported by most control system vendors at this point, usually with a well-specified deployment strategy specific to VMware or Microsoft Hyper-V. Hargrove has performed many of these installations. Our cloud deployments include setting up cloud-based servers and specialized cloud services like Noodle.ai and Imubit. Our IIoT and edge solutions have included systems like AWS Monitron and Ewon remote access. Thanks to greater availability of ruggedized and hazardous-area-classified devices, and better Wi-Fi and 5G coverage solutions, we have also implemented tablet and mobile phone-based MES solutions.”

Stephens reiterates that users must be clear about the problems they’re trying to solve. “Implementing these technologies may provide a platform or stepping stone for other future goals, but it’s important to have a well-defined and attainable scope for any initial projects,” he says. “Also, make sure that the underlying data and systems are reliable. Advanced solutions based on faulty data, unreliable connectivity or a disengaged workforce are doomed to fail.

“Finally, since evolution is constant without a final endpoint, all of these technologies will become more commonplace, and will be considered at the start of all future projects as part of a standard implementation. Cost and complexity will continue to be reduced, and the role of process control engineers will continue to expand.”

About the Author

Jim Montague | Executive Editor

Jim Montague is executive editor of Control. 
