
Knowledge on demand - Part 1

May 21, 2018
Tapping into better, faster, more useful process information—for wiser decisions—can be easier than you think.

Read Knowledge on demand - Part 2

We all know reach can exceed grasp, but at least in accessing and using process control information, many longtime hurdles are shrinking. Finally, knowledge-on-demand is closer at hand—physically, if not always mentally.

The chasms were often huge between plant-floor operations, where production data is generated, and other locations, such as utilities or central offices, where users make decisions to improve their processes. Historically, this was because reaching many remote signals and parameters on legacy equipment was difficult, and establishing the networking and programming needed to get information back to users could be even harder.

Fortunately, many tools, software, developers and system integrators are whittling down these obstacles—and simplifying and flattening networks—until it's often just a short hop or two to get data and know-how where it can do the most good. In fact, recent technical advances in single-pane displays, Internet-based network simplification, and mobility enabled by tablet PCs and smart phones have progressed so much that one of the biggest remaining snags is getting potential users to realize that many former barriers to accessing useful data are no longer there.

"Using big data for smart manufacturing is a fabulous concept, but it can be held back if we're dealing with a 50-year-old paper plant with a 35-year-old historian, which is why we need to make certain the information we're getting is correct," says Tim Gellner, senior consultant in the operational consulting group at system integrator Maverick Technologies, a Rockwell Automation company. "There are many handhelds for data entry and acquisition, and some that can do control, including one we designed and built several years ago for the three field operators at a small, electrical peaker plant in California. We did a pilot with Apple iPads, but rolled out with Microsoft Surface tablet PCs running Windows to connect to the control system."

To be ready for dips in the local grid, the plant has four GE gas turbines, which must go from a dead stop to moving in a matter of minutes. They’re controlled by GE Mark 6E and Cimplicity software. The balance of the plant is controlled by Emerson’s Ovation DCS hardware and software with recently added wireless components, as well as Splashtop remote access software for desktop sharing, which marries the operator stations to laptops or tablet PCs.

“This allows the operators to be untethered from the control room and connect to the two sets of HMIs on the turbines and balance of plant to receive alarms, perform control and troubleshooting functions, and monitor the demand requests from the utility,” explains Gellner. “Security is achieved by adhering to the NERC-CIP’s standards for using wireless onsite in critical infrastructures. The plant also uses secure log-ins and two-factor authentication. Remote sessions are encrypted with TLS and 256-bit AES. All process data is collected natively by the control system, and though the operators can make setpoint changes via the tablets, they’re mostly used for monitoring. The tablets are also easier to set up because, in the past, the plant’s few, remote operator stations and their drives had to be accessed in code and manually, which required a lot more labor.”

Close-in reasoning

Users, integrators, developers and suppliers have individual motivations for getting closer to their data, but there are some common themes as well.

"We see two main forces at work. Decision makers and managers are concerned with profit-and-loss statements, and need better information access to show clients the short-term impact of what they're doing," says Youssef Mestari, program director, Connected Plant, Honeywell Process Solutions. "The second driver comes from plant-floor technicians, operators and engineers, who need to get the right data to anyone at anytime. If it's like their mobile phones, and makes their lives simpler and easier, then they'll adopt it."

Mestari adds that upcoming data access solutions will take three main forms:

  • Intelligent, wearable devices that can provide real-time access to process data;
  • Immediate access to remote experts, who can see what operators see via onsite cameras, and provide guidance; and
  • Improved guides and step-by-step procedures, so less-experienced personnel won't have to remember every detail about their processes, and can share details more easily with their maintenance teams and managers to identify abnormalities and avoid shutdowns.

For instance, Kuwait Paraxylene Production Co. (KPPC), a subsidiary of Kuwait Aromatics Co. and Petrochemical Industries Co., will use two Honeywell Connected Plant services to improve its Continuous Catalytic Reforming (CCR) Platforming and aromatics complex, which produces paraxylene for plastic fibers and films at its Shuaiba petrochemical plant in Safat, Kuwait. KPPC will deploy Connected Plant's Process Reliability Advisor software for ongoing monitoring, early event detection and performance-issue mitigation, along with Process Optimization Advisor, which continuously monitors streaming plant data and applies Honeywell UOP's process models to determine the most economical operating mode.

"Information access is transforming how many technicians and operators do their jobs," says Mestari. "Instead of manually gathering hundreds of measures during scheduled rounds every few hours and not consulting them until an incident occurs, they can use Connected Plant to learn about production performance and equipment health in real-time, retrace value captures and immediately compare to in-spec or out-of-spec ranges."

Tim Goecke, director of Maverick's enterprise integration practice, adds that, "Our two main engagements are with people who have a problem they need to fix and those looking at their 'plant of the future' and how the Industrial Internet of Things (IIoT) can help manufacturing. There's a lot of curiosity because they want IIoT for more accurate forecasting, better planned maintenance and foreseeing breakdowns. However, we've learned from interviews, workshops and discovery meetings that their data infrastructures aren't there yet. They typically have historians, SQL servers or databases with transactional, batch or SKU information scattered across multiple data silos. From a process data perspective, they usually have legacy historian systems in place that haven't had any structure or contextualization layer applied, making them effectively indecipherable to the larger data-consumer community.

"That's why we promote strong data structures based on ISA's S95 and S88 standards, using tools like Rockwell Automation's FactoryTalk Historian, OSIsoft's PI or AspenTech's IP.21, which structure data with contextual layers. Many people are starting to get it, and coming to understand that data access, readability and fidelity are crucial to enabling the data-driven plant of the future."
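The contextualization layer Goecke describes can be pictured as a mapping from cryptic historian tags to an ISA-95-style equipment hierarchy. The sketch below is purely illustrative: the tag names, hierarchy levels and lookup are invented, and real historians (FactoryTalk Historian, OSIsoft PI, AspenTech IP.21) expose this through their own asset-model APIs.

```python
# Hypothetical sketch: wrapping a raw historian tag in ISA-95-style context
# so enterprise users can find it by equipment path instead of tag name.
from dataclasses import dataclass

@dataclass
class ContextualizedTag:
    raw_tag: str        # cryptic historian tag name
    enterprise: str     # ISA-95 equipment-model levels
    site: str
    area: str
    unit: str
    description: str

# A flat tag like "FI_1042.PV" means little outside the control room;
# the context layer ties it to a place in the plant model (example data).
asset_model = {
    "FI_1042.PV": ContextualizedTag(
        raw_tag="FI_1042.PV",
        enterprise="AcmePaper", site="Memphis", area="PulpMill",
        unit="Digester-2", description="Chip feed flow, t/h"),
}

def lookup(path):
    """Resolve an enterprise/site/area/unit path to its historian tag."""
    for tag in asset_model.values():
        if path == f"{tag.enterprise}/{tag.site}/{tag.area}/{tag.unit}":
            return tag.raw_tag
    return None

print(lookup("AcmePaper/Memphis/PulpMill/Digester-2"))  # FI_1042.PV
```

In practice the mapping lives in the historian's asset framework rather than a Python dictionary, but the principle is the same: readable context on top of raw tags.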

Gauging the gaps

Whatever the opportunities and potential rewards, however, it's still crucial to look before leaping. So, developers, system integrators and users must examine how to make information more accessible in their individual applications.

COOL SAVINGS IN CONTEXT

Figure 1: Evaporcool added Seeq data analysis software running on remote workstations to organize large volumes of sensor data from evaporative cooling units that mount on and assist high-capacity HVAC units. The data is integrated into an energy-consumption model they developed that calculates how much revenue the coolers save in real time, and has graphical look-backs that show pockets of savings.

"We work at all different levels in manufacturing, from the plant floor all the way through to business systems. We've seen enterprise/production links begin to get closer with the advent of application program interfaces (API) that made it easier for processes to talk to each other," says James Ruiz, COO at ITG Technologies, a member of the Control System Integrators Association (CSIA) in Jacksonville, Fla. "The process control market is moving away from top-down solutions from one vendor to instead creating niches that can be filled in by players up and down the line. When users need big data, no one wants to be the bottleneck, and so everyone has to be able to talk to third parties."

To assist its users, ITG implements its flagship SORBA IIoT enterprise platform, which applies machine learning and predictive analytics within four clicks to provide actionable information to users more quickly and simplify their ability to automate, monitor and control their processes. ITG reports that it uses the SORBA platform because it eliminates costly data science agreements or capital expenditure approvals. Similar IIoT platforms include GE's Predix and Siemens' Mindsphere, but implementing SORBA and other remedies still requires evaluating each user's application.

"The lead question is what's the biggest pain point? What 20% change will give an 80% return?" asks Ruiz. "After identifying the low-hanging fruit, the second step is asking if the user has the data they need. If not, we design an instrumentation platform to extract and collect it, and get it to them. Some users are worried about cybersecurity when going to the cloud, but we can show how to do it safely with secure protocols. This is important because we're also going to be doing more edge computing soon. Users are learning they don't want to send everything back to the cloud if they don't need to."
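The 80/20 screening Ruiz describes is essentially a Pareto analysis: rank candidate pain points by estimated loss and keep the few that account for most of the total. The sketch below uses invented example figures; a real engagement would pull these numbers from maintenance and production records.

```python
# Illustrative Pareto screen: which few pain points cover ~80% of losses?
# The annual-loss figures below are invented example data.
pain_points = {
    "unplanned compressor trips": 420_000,
    "off-spec rework": 260_000,
    "manual data entry errors": 90_000,
    "energy overuse at night": 70_000,
    "spare-parts expediting": 40_000,
}

def pareto_top(losses, coverage=0.80):
    """Return the smallest set of items covering `coverage` of total loss."""
    total = sum(losses.values())
    picked, running = [], 0
    for name, loss in sorted(losses.items(), key=lambda kv: -kv[1]):
        picked.append(name)
        running += loss
        if running / total >= coverage:
            break
    return picked

print(pareto_top(pain_points))
```

With these example numbers, the top three items cover more than 80% of the total loss, which is where the instrumentation and data-collection effort would focus first.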

Peter Martin, vice president of business innovation and marketing at Schneider Electric, explains that there are two main types of knowledge that users require: knowledge they need on an ongoing basis to keep their applications "on the road," and knowledge for making discrete decisions. "The operator's job is to create profitability safely, but most don't know if changing a particular setpoint will create value or destroy it, so they're often left to do what's comfortable," says Martin. "However, the business context is moving down to operations, and showing that operators, maintenance personnel and engineers need more information that can support profitability, such as verification for engineers that their control strategies are correct." Schneider Electric's three primary tools for safely boosting profitability are Profit Advisor, Control Advisor and Maintenance Advisor software, which examine operations in real time and allow different plant groups to work together.

In-context data saves

Because the primary aim of data access isn't just applicability but also speed, Evaporcool in Memphis, Tenn., recently worked with Seeq Corp. to organize large volumes of sensor data coming from its evaporative coolers, which reduce operating costs for high-capacity HVAC compressors for large buildings. They apply evaporative cooling to air drawn through the condensers' coils, which increases the capacity of the HVAC units, and reduces the electricity they need because they don't have to run fully loaded to provide the same cooling level.

However, Evaporcool's automation and savings calculations initially relied on sensor data and measurements that historically weren't detailed enough or didn't go back in time far enough to provide a baseline, according to Chris Curry, president of Evaporcool. The sensors monitor condenser air inlet temperatures, outside ambient conditions, compressor running time and current draw, and their raw data is processed using equipment and performance models to calculate savings.

"We worked with Seeq to create an operational model of the system, and show how it performs based on data from a variety of installations, combined with heat transfer and air conditioning efficiency constants," explains Curry. "Data from all the sensors is fed via the Internet into Seeq's data analysis software running on remote workstations."

The model takes relevant parameters, such as temperatures and humidity, and builds a picture of what's happening in real time, or at any other point in the operating history, compared to what would be happening without the evaporative cooling system. When supplemented with actual historical data for a given installation, the model results are even more specific. The energy consumption models have proven to be highly accurate, with an R-value of 0.98.
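The with-and-without comparison can be pictured with a toy version of such a model: estimate compressor power at the actual (pre-cooled) condenser inlet temperature and at the ambient temperature it would see without evaporative cooling, then take the difference. The performance curve and constants below are invented for illustration; Evaporcool's actual model is built from field data and HVAC efficiency constants.

```python
# Toy with-and-without savings model (all constants are invented examples).
def compressor_kw(condenser_inlet_temp_c, rated_kw=100.0):
    """Toy performance curve: power rises ~1.5% per degree C above 25 C inlet."""
    return rated_kw * (1.0 + 0.015 * max(0.0, condenser_inlet_temp_c - 25.0))

def hourly_savings_kwh(ambient_c, precooled_c):
    """kWh saved in one hour: power without pre-cooling minus power with it."""
    return compressor_kw(ambient_c) - compressor_kw(precooled_c)

# Ambient 38 C; evaporative cooling drops condenser inlet air to 28 C.
saved = hourly_savings_kwh(38.0, 28.0)
print(f"{saved:.1f} kWh saved per hour")  # 15.0 kWh
```

Summed over run hours and multiplied by the electricity rate, this kind of difference is what lets the system report savings as revenue in real time.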

"This with-and-without cost comparison is a major benefit for users," says Curry. "This makes it possible for us and our building-manager customers and their remote users to monitor and measure all aspects of the evaporative cooling system’s performance via the Internet, see savings in real-time, follow a summary graph back in time to identify pockets of savings, and establish upper and lower operating limits, so any crossing of the line can send an alarm." (Figure 1)
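The limit-crossing alarm Curry mentions reduces to a simple band check on each incoming reading. The sketch below is a minimal illustration; the limit values and savings samples are invented.

```python
# Sketch of an upper/lower operating-limit check (example values invented).
def check_limits(value, low, high):
    """Return None if the value is in band, else which limit was crossed."""
    if value < low:
        return "low"
    if value > high:
        return "high"
    return None

readings = [14.2, 15.1, 9.8, 16.7]   # hourly savings samples, kWh/h
LOW, HIGH = 10.0, 16.0

for t, v in enumerate(readings):
    breach = check_limits(v, LOW, HIGH)
    if breach:
        print(f"sample {t}: {v} kWh/h crossed {breach} limit -> send alarm")
```

A production system would debounce these excursions and route them through the alarm system rather than printing, but the comparison itself is this simple.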

"Data, remote or otherwise, might be interesting for users to see, but to be useful, it’s more important to see data from any source in the context of other datasets or plant context," says Michael Risse, VP and CMO at Seeq. "This is the leap now underway—from a trend line in a browser on any client, which indicates access to raw data, to a contextualized, informed insight on which users can make decisions for improvement."

About the Author

Jim Montague | Executive Editor

Jim Montague is executive editor of Control.