
Finding the value of AR, VR in process automation

Aug. 15, 2019
Identify situations where AR and VR can generate value to justify implementing them.


To avoid the flash-in-the-pan fate of previous 3D simulations and human-machine interfaces (HMIs), developers of today's augmented reality (AR) solutions emphasize that users must identify situations where the technology can generate value to justify implementing it.

"It's cool to have videogame-style graphics and content, but just 'seeing' isn't where the value is. The question is what actionable data does AR provide, and can it provide it in the context for work that's being done, such as where users are, what they're looking at, and what they're trying to accomplish," says Aaron Crews, modernization director at Emerson. "This can help users with tools to support them in the right time and place, be safer, and close the loop faster to get tasks done."

Anna Velena, AR product manager at Emerson, adds that one of AR's primary advantages is that it can provide data to users within the context of a specific asset or application to:

  • Improve the situational awareness of field workers by helping them understand what's around them;
  • Deliver knowledge on demand based on relevant information; and
  • Provide live remote assistance, including securing expert input if needed.

Peter Richmond, portfolio director for extended reality (XR) and 3D visualization at Aveva, adds: "Users love stepping into VR environments for training and design reviews, but they must also become part of a company's overall digitalization strategy. This means XR must solve problems, instead of just being a fancy toy. Achieving value will reduce the barriers to XR's acceptance."

Richmond reports that modern equipment, operations and plants are designed with CAD/CAM software, and Aveva's XR software can turn these original designs into immersive environments. However, because plant conditions change and the underlying data must be kept accurate, XR also needs to work with laser scan software such as Aveva Light Form Modeler (LFM). "A laser scan of an existing brownfield site can create a digital twin of the equipment or facility," says Richmond. "This serves as the foundation of a living digital twin."

Start small, then scale up

Because AR and its related realities are unfamiliar to many potential users, those who are slightly more experienced recommend playing with the components to learn how they work, and experimenting with a few small projects to see where they might work best in your individual applications and facilities.

"When customers ask about AR/VR, I make sure they understand the difference between them, ask which one they think they need, and what problem they're trying to solve," says Kim Fenrich, digital services product manager in the Industrial Automation business at ABB. "Identifying a problem to solve with AR and seeking immediate value is important. It’s good to have a big strategy, but you shouldn’t try to boil the ocean by developing a 3D model or AR application for every piece of equipment. Once the problem and the tool to solve it are identified, a small-scale deployment should be done to prove initial assumptions, and learn what's not known. You need to expect that, at some point, a leap of faith will also be required to invest in education, scale up, and to establish functions like remote expert support and guided maintenance procedures."

Rashmi Kasat, vice president and head of digital business development at Metso, adds: "As with any new technology, the best way to understand AR is to try it. If you have a clear use case for AR in mind with benefits, develop a lean concept and prototype; go out in the field, ask potential users if and how they might use it; validate the concept; and if it makes sense, make some investments. Even in later phases, you can pilot the concept with users, learn pros and cons, and make decisions about scaling up. This is what we do in Metso's new Digital Garage.

"At the same time, it's important to be sensitive to different needs of different people. New technologies like AR are good, but we shouldn't push them in the face of people. It’s important to involve diverse user groups including potential early adopters and the skeptics right from the concept stage until new AR/VR apps are launched. This way we learn all the challenges and opportunities from its potential use and application. This will feed into making usable designs and make adoption easier. People and user centricity is key for successful adoption of new digital technologies.”  

Seeking use cases

Even though AR is still in the early adoption phase in the process industries, its supporters report there are many use cases where it can assist process operations.

"Because there are so many ways that AR/VR and mixed reality can be applied, users must start by identifying a business problem, a pain in the neck, or just information that needs to be collected to help decide what AR or other solution to use," says Ken Adamson, vice president for PlantSight at Bentley Systems Inc. "Because AR can use data from the cloud, it can do fleet-level functions by capturing real-time data from many assets or facilities in varied environments that need an assist at the local level.

"For example, Shell is building a 450-acre chemical plant near the Ohio River and Pittsburgh. It has contracted with Eye-bot solutions to fly over the site twice a week, using drones and our reality modeling software to track construction and check for unsafe situations. Shell is also using our flood simulation software with the continuously updated reality model to determine what would be impacted first, and show pooling after heavy rains to indicate if any equipment needs to be moved."  

Ronnie Bains, director of Emerson's process simulation and digital twin group in Europe, adds that Emerson starts with a digital twin, which it defines as a software-based representation of a process or production facility. "We refer to VR as an immersive environment for training and operations support, all enabled by the digital twin under the hood," says Bains. "In the real world, field operations personnel interact with field equipment, such as a manual valve, and need to locate it, and take action to maintain or operate it. In a VR setting, they see the corresponding valve and the effect of changing process dynamics, and see what's going on inside the valve with data from the digital twin. This is why digital twins must be first-principles-based, so they'll be an accurate reflection of what the physical process is doing.

Figure 1: Noah Greene, mechatronics apprentice at Phoenix Contact, uses a Microsoft HoloLens to demonstrate an AR-aided pilot project for compressor controls. AR let users see indicators like machine cycles, performance history and energy use much faster. Source: Phoenix Contact

"After that, VR can be expanded to create a larger immersive environment, using existing 3D CAD files integrated into the VR solution, and enabling users to perform different activities. Technology has evolved dramatically over recent years, so these days it isn't hard to develop routine operations into VR for subsequent training. Other use cases include crisis management and safety training, commissioning and support, planning shutdowns and turnarounds, and managing asset health."

Though there are limited implementations in the process setting so far, Bains adds there are several use cases for AR/VR, such as:

  • Offshore oil and gas platform personnel could employ AR to better understand outside operations, carry out procedures, view and work with remote processes, and interact with enterprise-level colleagues;
  • Life science technicians, who must gear up to work in clean rooms, could carry out many formerly manual tasks with AR/VR by recreating more of those tasks in virtual formats. This could also aid material tracking and training; and
  • Maintenance teams could reduce risk during equipment changeouts by using VR environments to determine whether new devices, layouts and routes will fit in existing spaces, which would also shorten shutdowns and lower costs.

"Diagnostics, maintenance and root cause analysis will be the one of the more popular use case scenarios," adds Enrique Herrera, industry principal for manufacturing at OSIsoft. "With AR, you can see the insides of processes and equipment that otherwise you’d have to take apart. Also, you can overlay real-time data onto the real world. Let’s say you have a utility with a gas leak. The gas might be invisible, but with AR you could create a simulated model of the escaping gas based on the available data. Technicians and safety crews could thus 'see' a cloud of the gas, get a live data stream of its chemical composition and flow rate of the lead, and at the same time, do their job in a much better way."

Similarly, Dave Skelton, vice president of development and manufacturing at Phoenix Contact, reports that the company has developed and demonstrated two pilot AR applications: one for managing compressor controls on shrink-sleeve equipment at its U.S. headquarters in Harrisburg, Pa., and the other providing visualization of control assets at the point of installation in Building 4 on its campus in Bad Pyrmont, Germany.

"The project in Germany retrofitted the facility, using the Niagara protocol platform on the PLCNext controller, which interfaced multiple protocols like Profinet and Modbus," explains Skelton. "The project allowed the combination of control of multiple control platforms found, not only traditional HVAC, but all energy, security, fire protection and lighting controls. AR was used to show the maintenance staff using targets on their control cabinets, maintenance drawings and safety procedures.

"In Harrisburg, the technology project originally connected our PLCNext to the cloud for data collection, but the team also used AR to display information. It was immediately apparent that AR could quickly show indicators for machine cycles, performance history, voltage/current flows and energy use, and serve on a HoloLens to visualize values for operations at the machine (Figure 1). It can also show the equipment's accessible work range, and if it's in or out of work range. Recommendations if it's out of range, such as how to fix it and what parts might be needed, are future additions being considered. This was big for maintenance because AR can show information that doesn't have to be searched for anymore."

Matt Klinepeter, lead web developer at Phoenix Contact, reports that live production data is shown on the HoloLens, but first it's read into a Node-RED HTTP server hosted on the PLCnext controller. The system is networked using the regular HTTP protocol, and the HoloLens application consumes the data via C# software and a list of API dependencies.
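Because the data path is plain HTTP, a thin client on almost any device can consume it. The sketch below is a minimal Python stand-in for the C# HoloLens code: it polls a hypothetical Node-RED endpoint on the controller and prints the machine indicators an AR overlay might refresh. The address, the port (Node-RED's default 1880) and the JSON field names are assumptions, not Phoenix Contact's actual API.

```python
import json
import time
import urllib.request

# Hypothetical endpoint exposed by a Node-RED "http in" flow on the PLCnext controller;
# the address, port and JSON fields are assumptions for illustration only.
ENDPOINT = "http://192.168.1.10:1880/machine/status"

def poll_once(url: str = ENDPOINT) -> dict:
    """Fetch one snapshot of machine data over plain HTTP and parse the JSON body."""
    with urllib.request.urlopen(url, timeout=2) as response:
        return json.loads(response.read().decode("utf-8"))

if __name__ == "__main__":
    # Poll once per second, roughly how an AR overlay might refresh its indicators.
    while True:
        try:
            data = poll_once()
            print(f"cycles={data.get('machine_cycles')} "
                  f"current_a={data.get('current_a')} "
                  f"energy_kwh={data.get('energy_kwh')}")
        except OSError as err:
            print(f"endpoint unreachable: {err}")
        time.sleep(1.0)
```

On the HoloLens itself, the equivalent requests would run inside the C# application Klinepeter describes, which binds the returned values to the holographic indicators shown in Figure 1.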

"Up to now, because of computer gaming and its community, there's more VR technology than AR," adds Nathan Raupach, web services manager at Phoenix Contact. "However, AR's advantage is users can tie in many technologies and products they're already using to try it. You just have to open-minded. Traditional HMIs have been serving the process industries for 30-40 years, but unplanned downtime still causes lots of lost revenue. AR puts an overlay on existing assets, so it can be used in many legacy and retrofit applications to get data out of file cabinets and off hard drives. It lets users update as they go, and they can immediately see its value of no longer having to search so much for documentation and data."

About the Author

Jim Montague | Executive Editor

Jim Montague is executive editor of Control.