
Knowledge on demand - Part 2

May 28, 2018
Tapping into better, faster, more useful process information—for wiser decisions—can be easier than you think.

Read Knowledge on demand - Part 1

Putting fingertips to work

Of course, once users gain quicker access to data and can make better decisions, some are also confronted with unexpected capabilities and new roles based on that enhanced information and perspective.

For example, Wood, a global, vendor-independent system integrator, reports that many of the 1,300 automation and control professionals in its Automation and Control division develop real-time operations software and provide support services. The software and services are used to monitor key performance indicators (KPIs) and overall performance, often running models or "digital twins" in parallel with physical processes. Examples include air and water emission applications in the oil and gas industries, which are increasingly using secure networks to quickly link operations monitoring software like Wood's to other systems via services such as Virtuoso. Wood Group, Mustang and recently acquired Amec Foster Wheeler are all part of UK-based Wood.

"Because advanced process control (APC) generates revenue by using production data to minimize resource use, much of our group works on process databases for emissions monitoring," says Hodge Eagleson, APC and senior principal technical consultant at Wood. "With today's proliferation of data and databases, it's possible to take additional indicators, find trends, and use them, too. The innovation in APC from more accessible data is—instead of just controlling one process—extending the benefit from monitoring historians and KPIs to telling more about uptime, and showing how well those benefits can be further maximized by running against constraints. Our solution is model-based control, which predicts the future and adjusts early to it. This goes beyond feedback-based control, which can't respond until it sees results."

Similarly, Brian Burgio, account manager for advanced solutions, software and APC at Yokogawa Corp. of America, confirms that model-predictive control (MPC) and APC have not only been gaining speed, but are also enabling pretests and checkouts, tuning loops, aiding valve and equipment repairs, and providing speedier intelligence to remote monitors.

"In many cases, users no longer need to be physically there," says Burgio. "We've partnered with Shell on multivariable (MV) control since 1999, and helped develop its MV control package, Platform for Advanced Control and Estimation (PACE), which was released to Shell in 2015 and to the overall market a year later. In fact, Air Liquide just standardized on PACE in North America. It's a comprehensive tool that combines MV functions, is web-based to better display operations on HMIs, and helps run plants in three ways: broad plant operations such as manipulating setpoints, performing advanced regulatory control such as adjustments based on ambient conditions, and improving MV control so less-experienced engineers can do it."

Single pane shifts jobs, too

Ironically, as systems simplify and knowledge gets closer to users, the roles of those users and even their familiar devices are also evolving.

For instance, Namibia Breweries Ltd. (NBL) in Windhoek, Namibia, recently implemented a 1-megawatt (MW), roof-mounted solar plant with more than 4,000 panels, 66 inverters and four cluster controllers, which are connected to three of the brewery's generator areas, and make NBL electrically self-sufficient. This project and managing its utilities prompted NBL's engineers and managers to better coordinate some of their departments and organize them into a cohesive entity that could make real-time business decisions based on one version of the facts.

SIMPLER SUDS IN SOUTHERN AFRICA

Figure 2: Namibia Breweries Ltd. integrated data from 4,000 panels, 66 inverters and four controllers in its 1-MW rooftop solar plant by using Wonderware Historian from Schneider Electric and a virtual enterprise on two TOP Servers from Software Toolbox to coordinate the utilities with other brewery operations that are controlled by more than 100 PLCs, and to display KPI results over a simplified network that includes an HTTPS dashboard server. Source: Schneider Electric

"We have the brewing, packaging and distribution departments, and each focuses on doing their jobs to the best of their abilities, but without necessarily much concern for the common denominator that makes it all possible—utilities,” says André Engelbrecht, industrial control systems manager at NBL.

This unification included NBL's CO2 plant, ammonia cooling system, boiler house, water treatment and sterile air plants, and power meters. NBL has a central DCS that controls its brewing process, but found it needed more immediate data and collation resources—a living system—to achieve its present and future goals of accurate decision support based on reality and real-time production information. Engelbrecht reports that NBL implemented Wonderware Historian and the scripting capabilities of Historian Client, both from Schneider Electric.

NBL configured a virtual enterprise consisting of two TOP Servers from Software Toolbox to balance the facility's load of more than 100 PLCs and other systems, one DataHub server, the Wonderware Historian, a main historian data warehouse and a web server. A secure HTTPS dashboard server lets managers view daily and monthly sales and operational KPIs from anywhere, and weekly, real-time stock volumes are sent to NBL’s advanced planning system using Historian Client queries.
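To make that data flow concrete, here is a minimal sketch of the kind of query a Historian Client script or reporting service might issue to pull hourly utility consumption for a dashboard. It assumes a SQL-accessible historian along the lines of Wonderware Historian's Runtime database; the server name, credentials, tag name and retrieval options are hypothetical placeholders, not NBL's configuration.

```python
# A minimal sketch of a historian query for daily utility consumption.
# All connection details and tag names below are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=historian-server;DATABASE=Runtime;"
    "UID=reader;PWD=secret"        # placeholder credentials
)

query = """
SELECT DateTime, TagName, Value
FROM History
WHERE TagName = 'Utilities.Water.FlowTotal'      -- hypothetical tag
  AND DateTime >= DATEADD(day, -1, GETDATE())
  AND wwRetrievalMode = 'Cyclic'                 -- evenly spaced samples
  AND wwResolution = 3600000                     -- one value per hour (ms)
ORDER BY DateTime
"""

# Print the last 24 hours of hourly values for the chosen tag
for row in conn.cursor().execute(query):
    print(row.DateTime, row.TagName, row.Value)
```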

“We installed TOP server to retrieve data from our utility plants and systems, and used the DCS to build a SCADA system," explains Engelbrecht. "We then developed a web-reporting system for production personnel and a dashboard system for management. Most of our physical servers are now hosted in a virtual environment, and this made things a lot easier, such as time synchronization between the old Historian and the OPC server.” (Figure 2)

NBL's simplified data system lets qualified staff view utility consumption and production information at the same time from anywhere. They can also view daily, weekly and monthly consumption information on the same platform. In the future, NBL reports it will be able to switch off non-critical plant equipment to ensure that maximum demand remains below target.
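That future maximum-demand scheme can be sketched simply: when metered demand approaches the contracted target, shed non-critical loads in priority order. The Python example below illustrates the idea with hypothetical targets and equipment; it is not NBL's implementation.

```python
# A minimal sketch of demand limiting: when metered demand approaches the
# contracted maximum, shed non-critical loads first.  Target, headroom and
# the load list are hypothetical examples.

DEMAND_TARGET_KW = 5000.0
HEADROOM_KW = 100.0          # start shedding before the target is breached

# Non-critical equipment, lowest priority first (shed these first)
sheddable_loads = [
    {"name": "yard lighting",        "kw": 40.0,  "on": True},
    {"name": "spare air compressor", "kw": 250.0, "on": True},
    {"name": "warehouse HVAC",       "kw": 180.0, "on": True},
]

def manage_demand(metered_demand_kw):
    """Switch off non-critical loads until demand drops below the threshold."""
    excess = metered_demand_kw - (DEMAND_TARGET_KW - HEADROOM_KW)
    for load in sheddable_loads:
        if excess <= 0:
            break
        if load["on"]:
            load["on"] = False
            excess -= load["kw"]
            print(f"shedding {load['name']} ({load['kw']} kW)")
    return excess

manage_demand(5350.0)   # example: 450 kW above the shed threshold
```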

Thanks to its coordinated operations, Engelbrecht reports that NBL:

  • Meets CO2 sales targets by optimizing sales versus storage capacity and use;
  • Complies with water-savings rules required by Windhoek and the local NamWater utility;
  • Saves electricity with maximum demand implementation;
  • Improves solar plant operational effectiveness;
  • Enhances fault-finding with Historian Client and video playback functionality on the central DCS;
  • Improves decision-making about plant requirements, such as using historical thermal energy data that allowed NBL to reduce its new biomass boiler requirement from 8 MW to 5 MW;
  • Views consumption data in conjunction with production information to improve loss control;
  • Verifies KPIs of new plant and equipment, which makes Historian Client critical to NBL’s business; and
  • Achieves more accurate calculation and reconciliation of project KPIs and ROIs.

Combine, multitask functions

Beyond seeking a single pane of glass for multiple data streams, many developers and users are merging monitoring and control functions with increasingly software-based tools to remove old barriers and get more and better information in front of users faster.

For example, U.K.-based Anglo American plc's Minas-Rio project includes an iron ore mine and enrichment unit in Conceição do Mato Dentro and Alvorada de Minas in the state of Minas Gerais, a 529-km pipeline that runs through 33 municipalities in Brazil's states of Minas Gerais and Rio de Janeiro, and an iron ore terminal at Porto do Açu (Figure 3). However, these scattered facilities lacked data integration and operational standardization between them, so Anglo American enlisted Rockwell Automation and system integrator IHM Engenharia, part of the Stefanini Group.

ORE MOVING ASSETS

Figure 3: Anglo American's Minas-Rio iron ore mine, enrichment plant, pipeline, filtration and port in Brazil use a PlantPAx control system with integrated asset management to standardize and coordinate operations and generate graphs and reports more quickly. Source: Anglo American and Rockwell Automation

Together, they combined Minas-Rio's controls into four main parts: the mine and enrichment plant, pipeline, filtration plant and port, each with its own controllers, servers, control room and web-based operating stations. In all, the distributed control system (DCS) handles 20,000 instruments, 800 motors and 1,500 intelligent instruments networked via Profibus PA or HART. The asset management system, implemented across all parts of the DCS, assists parameterization of the smart instruments, while the PlantPAx control system gets the right data, graphs and reports to users faster by integrating more closely with the process information management system (PIMS) and the manufacturing execution system (MES).

Likewise, Aimee Xia, product marketing manager for DAQ and control at National Instruments, reports that NI and Innovari Inc. collaborated recently to help utilities optimize their energy grids with edge control and cloud analytics. "Innovari’s Interactive Energy Platform (IEP) uses artificial intelligence, big data analytics, proprietary optimization routines and grid-edge hardware built on NI technology to deliver capacity and address grid demand," explains Xia. "With this technology, utilities can focus edge-of-grid resources as opportunities, rather than threats, and users as partners, rather than just bill-paying customers. Deploying IoT systems like Innovari’s IEP provides real-time, deep situational awareness into the grid, helping to reduce the likelihood and duration of power outages to optimize performance."

Innovari’s grid-edge hardware devices, called Energy Agents, attach to participant buildings to deliver two-way communication of energy information back to the utility, and to balance edge-of-grid resources for an optimized grid. The agents are designed with CompactRIO single-board controllers and LabVIEW software for distributed sensing and control.
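The agents' job boils down to a two-way pattern: report local measurements upstream, and act on dispatch commands sent back. Innovari's production system runs on CompactRIO and LabVIEW; the self-contained Python sketch below only illustrates that pattern, with in-memory queues standing in for the real communication link and made-up message formats.

```python
# A minimal, self-contained sketch of the two-way edge-agent pattern:
# publish telemetry upstream, act on dispatch commands sent back.
# Queues stand in for the real communication link; message fields are
# hypothetical.
import queue
import random

to_utility = queue.Queue()      # agent -> utility (telemetry)
to_agent = queue.Queue()        # utility -> agent (dispatch commands)

def edge_agent_cycle(curtailment_kw):
    """One agent cycle: measure, report, then apply any pending command."""
    measured_kw = 480.0 + random.uniform(-20, 20)   # stand-in meter reading
    to_utility.put({"site": "building-42", "load_kw": round(measured_kw, 1)})
    try:
        command = to_agent.get_nowait()
        curtailment_kw = command.get("curtail_kw", curtailment_kw)
        print(f"agent applying curtailment of {curtailment_kw} kW")
    except queue.Empty:
        pass
    return curtailment_kw

# Utility side: read telemetry, decide, and dispatch a command back
edge_agent_cycle(0.0)
reading = to_utility.get()
if reading["load_kw"] > 470.0:
    to_agent.put({"curtail_kw": 50.0})
edge_agent_cycle(0.0)
```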

For instance, NI reports that Kansas City Power & Light uses IEP to gain real-time, deep situational awareness into its grid and reduce the likelihood and duration of power outages, and it has gained more than 400 hours of operation per year with the 15-MW project, which engaged more than 200 commercial and industrial customers. Likewise, Mumbai-based electric utility Reliance Infrastructure (RInfra) implemented IEP on commercial buildings and mobile generators to stabilize its grid in stressed regional areas. Since 2014, RInfra has responded to hundreds of grid events using IEP, and has shown that the demand side can be a real part of its resource plan year-round.

More recently, Jeff Phillips, section manager for software platform monitoring at NI, reports that LabVIEW's usual functions of hardware identification, configuration and documentation have been combined onto one software palette by the SystemDesigner function in its new LabVIEW NXG software, and redistributed by NXG's web module to a wider circle of users.

"LabVIEW NXG lets users bring in rich media from added ports, animate and document devices for a user's entire system, and dive more quickly into an interactive workflow. This lets them automate measurements, acquire data, apply analytics, and generate reports—all without having to write graphics code," says Phillips. "At the end of the workflow, NXG's web module can build and distribute remote interfaces. This used to be big challenge, so we introduced the web module that has APIs for using and sending data, web server for deploying data anywhere, and building and hosting interfaces. NXG has a more modular code base, so a lot of other software can soon be brought to market built on the same base."

Brian Hoover, test software architect at South Korea-based battery manufacturer Samsung SDI, adds that “With this next phase of LabVIEW NXG, I can integrate new ways to visualize data, either on the desktop with vector-based UI graphics or in the browser for secure hosting, into my existing LabVIEW applications to simplify reporting test results.”

Simplify, flatten networks

Another strategy for making data more accessible and better decisions sooner is to collapse the networks they traverse, and shorten the distances those volumes of information must travel from I/O and other edge devices to higher-level data processing functions.

FASTER USER/DATA FACETIME

Though there's no one-size-fits-all solution when it comes to knowledge on demand, there are several basic strategies for accessing more useful information faster for improved decision-making. They include: 

  • Select HMI/SCADA software that can be displayed and scaled, typically via HTML5, on tablet PCs, smartphones and other handheld or mobile devices of varying sizes.
  • Adopt easier-to-install and configure commercial, off-the-shelf (COTS) software with fewer time-consuming programming/coding requirements and more point-and-click, drag-and-drop and/or automatic capabilities. 
  • Combine or remove steps on the path to information access with automated drivers, data conversion and translation, polling, information distribution and other functions that eliminate traditional hurdles.
  • Seek single-pane-of-glass displays that provide information from multiple sources on fewer interfaces, but at the same time preserve adherence to human-centered design principles, so displays don't become too cluttered for operators to quickly understand what's happening in their processes.
  • Combine and flatten network organizational structures where appropriate, perhaps by using more Ethernet and Internet methods, but do it securely by maintaining network segmentation, firewalls, managed permissions, traffic monitoring, anomaly detection and other cybersecurity protections.
  • Consider cloud, virtualization or other software as a service (SaaS) programs to quickly relay and retrieve information between plant, enterprise and other departments, but again, establish and enforce security policies and procedures for all related personnel, contractors and clients. 

For example, Turck just launched its Backplane Ethernet Extension Protocol (BEEP), which has already been integrated into many of its multi-protocol digital block I/O modules. BEEP allows a network of up to 33 devices (one master and 32 slaves) or 480 bytes of data to appear to the PLC as one device on one connection using one IP address. By reducing the number of connections the PLC sees, users can create high-density I/O networks and still use their low-cost PLC.

The BEEP web server makes the first device in the line a master, which can then scan the entire network and create a new data map that includes all of the downstream devices, with all device configuration options saved in the master. BEEP also supports drop-in device replacement, reducing downtime and overall costs. If a network is set up using BEEP, a technician can simply replace a slave device with a new one to keep the system online. The BEEP master will automatically recognize the device, assign it an IP address, and download the parameters to it.
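Conceptually, the master's job is to scan the line, build one consolidated data map of all downstream devices within the 480-byte limit, and keep each device's saved parameters so a replacement can drop into the same slot. The Python sketch below illustrates only that idea; device names, sizes and offsets are hypothetical, and the actual protocol mechanics belong to Turck.

```python
# A conceptual sketch of presenting many downstream I/O devices to a PLC as
# one consolidated data map behind a single address.  Device names, sizes
# and offsets are hypothetical; this is not Turck's implementation.

class BeepStyleMaster:
    def __init__(self):
        self.devices = []        # downstream block I/O modules, in line order
        self.data_map = {}       # device name -> (offset, size in bytes)

    def scan(self):
        """Rebuild the consolidated map; total payload must stay <= 480 bytes."""
        offset = 0
        self.data_map = {}
        for dev in self.devices:
            self.data_map[dev["name"]] = (offset, dev["size"])
            offset += dev["size"]
        assert offset <= 480, "exceeds the 480-byte limit"
        return offset

    def replace_device(self, old_name, new_name):
        """Drop-in replacement: the new device inherits the saved slot and config."""
        for dev in self.devices:
            if dev["name"] == old_name:
                dev["name"] = new_name   # saved parameters stay with the master
        self.scan()

master = BeepStyleMaster()
master.devices = [{"name": f"block-io-{i}", "size": 14} for i in range(1, 33)]
print("total bytes mapped:", master.scan())      # 32 slaves, one IP address
master.replace_device("block-io-7", "block-io-7-new")
```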

Charles Fialkowski, process safety director at Siemens, adds that its new line of I/O modules carries the Profinet protocol directly to field devices, which enables more open communications that aren't limited to a particular manufacturer, greater bandwidth at the field level, and more transparent connectivity between plants and enterprises. Profinet has long been a networking backbone for Siemens, though not as much in the process industries, which is why Fialkowski reports the company adopted Profinet more fully into its process control, communications and I/O infrastructure about a year ago.

"We call this 'Plug and Produce,' " says Fialkowski. "Previously, we had devices and drivers that required hours of integration. Now, we just plug equipment into the network, and it and its drivers are automatically detected. We support Profinet with one-day workshops, and we've been seeing more interest in them from the process industries, until they're now about 50-50 discrete and process."  

At the same time, Siemens has been developing other tools that combine control functions with common databases, which helps break down former engineering silos and the time-wasting obstacles that go with them, and makes it easier for users to adjust, update and annunciate operating tasks. "These tools include Comos software, which is a way to integrate traditionally separate engineering disciplines, and provide intelligence across previously separate silos," explains Fialkowski. "For instance, users in a traditional plant may have no current, as-built drawings, but Comos can help them get the information they need more easily. We're also practicing the digital twin philosophy with Simit simulation software, which lets users conduct tests before making process changes, and do operator training without touching plant systems."

Cloud closes gaps

Of course, the most transformative way to put useful data within easier reach of users is to replace rigid hardware and dedicated software with more flexible formats on virtual servers and cloud-based computing services.

"Big customers can have their own cloud-style infrastructures with segregated servers on-premise or within larger cloud services, which co-locate their data using enterprise-level accounts that make sure its not accessible or mixed with other servers—but smaller companies can sign up for similarly reliable and secure services," says ITG's Ruiz. "The other benefits of simplifying on the cloud is that users can do pilot projects without major overalls or capital expenditures, and this can cut time to market from months and years to just weeks and days. It's a lot easier to subscribe for cloud computing space and data storage than it is it buy hardware infrastructure, develop code and implement HMIs, SCADA and MES. Plus, the cloud can also make big data, analytics, digital twins, machine learning and artificial intelligence easier based on the data pulled in."

About the Author

Jim Montague | Executive Editor

Jim Montague is executive editor of Control.