Alternate paths from the field to cloud

June 27, 2017
Going off road to move data from the field and plant floors to enterprises and the cloud means bypassing the usual control routes.

Everyone's looking for shortcuts. It's a primary preoccupation of modern life and a cornerstone of business and industry. Part of always doing more with less, of course, includes moving production data faster and more efficiently from process applications, field equipment and plant-floors up to decision-makers and enterprise levels.     

In the past, this required running data from sensors, instruments, transmitters and I/O modules through dedicated components, controls and networks that—even though they've evolved from using point-to-point hardwiring to fieldbuses and Ethernet—are showing up as increasingly inflexible. Some process users and system integrators report that adding devices and functions to existing systems, and even making adjustments, typically requires lengthy and complex reprogramming and reconfiguration, if they can be done at all. Think of it as road construction added to traffic jams.

Finding the fast lane

"We're in the process of monitoring several hundred variable frequency drives (VFD) that run the kiln fans and setting equipment at an architectural brick manufacturer because some of the drives previously had failures," says Steve Beck, chemical engineer at Huffman Engineering, a CSIA-certified system integrator in Lincoln, Neb. "We thought about adding a PLC to get the data to the SCADA system, but we also wanted to limit costs, so we added an OPC-UA server, Ignition SCADA software from Inductive Automation and an embedded Ethernet switch/adapter, which can communicate directly with the VFDs. We initially replaced 20 of the drives, and this new solution has already saved about $10,000, not including the labor saved by not having to install, configure, program and get power to another controller. We're already looking to expand to monitoring more drives. If we build out the whole plant, the savings could be six figures."

Though Huffman's VFD project at the brick plant only involves monitoring at present, Beck adds it could eventually be used for control, too. "We're not changing controls yet. It would depend on the success of the monitoring project, but we could bypass the existing controls, and do control with the Ignition SCADA software," says Beck. "We could write setpoints to the parameters in the VFDs instead of writing to the usual PLC, and this would let us use Ignition to perform control functions. Other similar arrangements could do the same, such as using a simple protocol converter with software like Wonderware's InTouch or System Platform, or Kepware's Kepserver. There's not much documentation about how to do this, so it's also important to test hardware and software you're considering. We set up a bench test for monitoring the VFDs, and checked that Ignition, OPC-UA and the Ethernet switch would work before installing them at the factory."      

Colin Geis, product management director at Red Lion Controls, adds that, "The IIoT is evolving quickly and can be classified in a few stages. At first, IIoT was seen as a data visualization tool in the cloud. Data would be transmitted to a cloud repository, and the IIoT platform would present a data dashboard allowing anybody, anywhere to see system data easily. These cloud dashboards could be considered 'IIoT v1.' Next, IIoT v2 took the data visualization and added intelligence, reporting and alarming, which allowed more autonomous operation, notifying users if a system or process was outside of programmed conditions, and didn't require an operator to monitor screens 24/7. Now, IIoT v3 is opening the way for distributed control, or bidirectional control of systems and processes. This phase of the IIoT expands on data dashboards and centralized alarming, and also creates a feedback loop with the edge of the network. While some customers are allowing the feedback loop to control equipment, more customers are simply allowing this feedback loop to modify a local database in the field (e.g. confirming that an alarm was received or an action was taken by the cloud platform).

"Another practice we’re seeing is having edge devices alert and alarm locally based on their programming, even as they’re communicating information upstream to a cloud platform where further action may be taken.  Customers must place a large amount of trust in their equipment and network to allow for full autonomous control of their equipment by an IIoT platform, and this hybrid model gives customers greater confidence. More customers asking questions about this functionality, and our equipment is capable of performing these tasks, but it's still a little early to call it mainstream. Customers are deploying more edge control with IIoT connectivity because the visibility to the system and process has improved so much, but it isn’t standard practice to deploy platform-driven edge control yet."

Stephen Neuberger, CEO of Krohne Group, adds that, "Users need standard, uniform, open communication channels to participate in the IIoT without compromising safety and availability. Likewise, modularized and decentralized instrumentation of sensors is allowing them to send data directly to plant enterprise resource planning (ERP) systems. We're not saying the DCS is going to disappear, and it can't because real-time control can't be in the cloud. However, there have to be some added paths for clients' information to take, which improves their ability to add value, and achieve individual production with all the advantages of mass production."

Desperately seeking detours

Just as gridlocked drivers fantasize about cruising on the shoulder, users and integrators have long sought similar ways to get around the networking snarls and data delays up ahead. However, though they could dream, there was no fast lane due to scarce resources, network bandwidth limits and other constraints, which have only recently started to give way to new pathways.

"In our circle of control users, we're all very conservative, but we accept that we must get data from plant floors to businesses. We've worked with custom applications and data collection for many years, but there's a growing tendency to believe that we can't continue with proprietary software, which was a typical solution practice in the past," says Michael McEnery, president of McEnery Automation, which is a CSIA-certified system integrator in St. Louis. It serves mostly large beverage manufacturers and their batching, recipe management, filling and canning applications. "We try to support older operating systems, but keeping up with traditional technology is just getting to be too costly, and users want standardized operating systems, hardware and security.

"Plant managers at one sewer district want to see operations like well levels on their tablet PCs and smart phones, so we're putting together a quote to take their data that's now in a GE iFix historian, and put it in a best-fit application that's cost effective and reliable. Fifteen years ago, we did custom databases and web pages for monitoring and maintaining valves, motors and other equipment, but we don't prefer that approach anymore. Now, data from historians such as Rockwell Automation's FactoryTalk Historian, OSI PI or Wonderware Historian can be delivered by off-the-shelf reporting tool clients that easily integrate with browsers on PCs, tablets and smart phones. We're also seeing more clients considering IIoT and big data, but there’s still more talk than action."

[sidebar id =4]

McEnery adds that phenomena like IIoT and big data are forcing process engineers and system integrators to make some mental shifts. "As engineers, we're supposed to be as efficient as possible, including how much memory we use and data we gather," he explains. "If we didn't need specific data, we didn't capture it because of cost. However, these days, data and memory are a lot cheaper, so the new mindset is to collect as much as possible, and then use analytic software to see if it's useful later."    

Mike Boudreaux, director of the Connected Services program at Emerson Automation Solutions, adds that, "A wide swath of users are recognizing that new analytics, applications, software and other capabilities can be enabled by connecting to existing systems. So, while getting DCS data still makes sense, they can also deploy wireless and other network points that aren't needed for control and safety, and get valuable pressure, temperature and flow measurements. Many process applications don't need millisecond or 1 second updates, and instead have updates of a minute or more, so some users are looking outside the process control network to develop a 'process data network' with many of the same devices. For example, Emerson's wireless pressure gauge now has an electronic sensor and digital wireless instead of the traditional mechanical device with dials."

Of course, once this data reaches an Ethernet network, it can be delivered immediately to the enterprise, the cloud and other server-based applications for analysis and performance/reliability monitoring on laptops, tablet PCs, smart phones and other interfaces. "Emerson and other providers can host applications, and even do monitoring for clients as a service," adds Boudreaux. "It's a lot like a cable TV subscription. However, this is only monitoring; we're not doing control and safety functions in the cloud, though wireless for monitoring is proven, and more efforts to leverage the cloud are happening."

History of shrinkage

While the idea of skipping the PLC or DCS might be unthinkable to many engineers, operators and managers, this concept is just one chapter in the epic story of faster, more powerful, less costly and smaller computing devices in industry and elsewhere. The journey from clunky hardware to virtual software seems inexorable in all cases. 

"We should remember that PLCs have shrunk a ton since they were first introduced," says Michael Robinson, national marketing manager for projects, solutions and services at Endress+Hauser. "The first 8-bit PLCs in the 1970s were big and very costly, but by 2012, they were 32- or 64-bit, fit in your hand, and cost only $500. Ten years ago, getting data from instruments was cumbersome and expensive with analog inputs and outputs, but for today's users that just want to  visualize or analyze their data, it may still be difficult to integrate and network several plant controllers. Everyone uses the same Ethernet cables now, but the fieldbus protocol wars of the mid-1990s still haunt us today, so we still have five communication cards talking different languages.

"Nevertheless, many users want to aggregate and visualize data, and not necessarily do control, or deal with the extra costs of adding PLCs and HMIs. They just want to monitor their flows and tanks, and communicate that data to the cloud. These efforts began with HART, Profibus and Ethernet gateways and WiFi or cellular modems. We also made chart recorder replacements, Memograph and Ecograph, which brought instrument data in, and did local visualization. Now, it's just shared with the enterprise via the cloud, and the gateways are called IoT or edge devices, some of which can run open-source code like Python to manipulate their data streams. As result, many users ask themselves why they need a PLC if they're not doing control?"

Dave Emerson, director of the U.S. technology center at Yokogawa Corp. of America, adds that, "The influx of IT into the automation space is increasing. It's obvious, just like when computing went from Unix to Windows, and the same is happening to distributed controls. If you want to add I/O points to a DCS then you typically need to call a specialist, but new, low-cost sensors, wireless and edge devices are getting data to the cloud faster and at less cost."  

Bruce Billedeaux, senior consultant at system integrator Maverick Technologies, a Rockwell Automation company, adds that bypassing PLCs can be accomplished. In fact, doing it was one of the initial visions of the HART protocol, which was developed in the mid-1980s, and originally sought to perform distributed control through a 4-20 mA loop. "However, HART was too far ahead of its time back then, and users couldn't handle the communications needed," says Billedeaux. "Now, HART is getting more of the backbone it needs, and it's going to become ubiquitous as more controls move to the cloud."

Simplifying with SCADA

Once data from any sensor, instrument or other field or plant device touches Ethernet and the Internet, it's basically off to the races, and users at every level from the enterprise to the cloud can access it if they're speaking the right protocol and have the right permissions and programming. Whether it's bypassed controls or not, this information also needs a place to land and a SCADA/HMI to display it, though these interfaces are also simplifying and standardizing around the Internet's well-known, Ethernet-based Transmission Control Protocol/Internet Protocol (TCP/IP).

[sidebar id =5]

For instance, vegetable oil and grain processor Adams Group Inc. in Arbuckle, Calif., recently worked with system integrator Calcon Systems in San Ramon, Calif., to implement web- and Java-based Ignition software for supervisory control, data acquisition, manufacturing execution systems (MES) and IoT solutions in its vegetable oil plant. Adams is the largest U.S. supplier of organic and expeller-pressed oils, including sunflower, safflower and canola.

“We just wanted to share data between two systems, and Ignition's advantage is that it's all in its open database,” says Shawn Ferron, project manager at Calcon. “The information isn't in some proprietary database you can’t access.”

Adams reports Ignition enables more efficiency, better visual displays, more data, faster implementation and is easier to use than its previous SCADA systems, and adds it may apply it company-wide. About 20 operators and managers in the vegetable oil plant's control room use Ignition on nine computer monitors, which show KPIs and details about batches, alarms, downtime, runtime, shift performance and historical data (Figure 1). "We've seen an increase in some efficiencies that were tangible,” says David Kay, operations manager at Adams Group.

[sidebar id =1]

Lee Smith, general manager at Adams Vegetable Oils, adds that Ignition's unlimited licenses give his operations more flexibility. "I didn’t want to be limited by how many licenses I could have, or how many screens I could put in my plant, or how many people I could have looking at a computer at night,” he explains. "I didn’t want to have to buy a new client every time we wanted to do something. The future is open-source technology and limitless licenses.” As a result, Adams has created several projects with Ignition, including dashboards with graphic displays of KPIs and other information and a work order creation system.

"Certainly, Ignition is SCADA software, but in fact, SCADA is only part of the overall platform. It's vital to bring device data into the infrastructure, and make it available to the whole enterprise. To do that, you have to decouple devices from applications," says Don Pearson, Inductive's chief strategy officer. "The Ignition industrial application platform and its unlimited license model are perfectly suited for this task. Old licensing models break the bank because they couple applications and devices, and charge by connections, users and tags, effectively stifling innovation.  They're complex and costly when users want to get more data, add devices and scale to hundreds of thousands of connections across an enterprise.”

Likewise, eight oil and gas production facilities in Oman's Mukhaizna field have used steam-assisted gravity drainage to increase recovery for more than 10 years, but their data collection systems based on Rockwell Automation's PLCs linked to Iconics Genesis32's HMI/SCADA software began to need help as the facilities' data points multiplied over the years. They'd been using Top Server OPC server from Software Toolbox to gather data from up to 20 PLCs and feed it to the HMI, but recent equipment additions had increased tags in the system close to 30,000. This wouldn't usually be a problem for Top Server, but the HMI was also requiring it to make device reads that bypassed the server's device-level optimization, and the HMI was requesting out-of-order updates on groups of OPC tags—which both slowed the data collection process.

“The OPC server seemed to be dying under the load,” says Juan Munoz, project manager at the Mukhaizna oil field project. “Even at rates as low as once per second, it was difficult to scan 30,000 tags, and get the critical data changes that we needed.”

[sidebar id =10]

Knowing the server wasn't at fault, Munoz reports he searched Software Toolbox's website, and found Cogent DataHub, a memory-resident, real-time database from Cogent Real-Time Systems, a subsidiary of Skkynet. Acting as an OPC client to Top Server, DataHub can request data based on tag value changes, which are called “asynchronous advise.” This means that, instead of 30,000 tags per second, the server only sends data for a tag when it changes value, and it's free to poll the devices in the most efficient way, always keeping DataHub up to date with the latest data values. DataHub also keeps all the latest tag values in memory, and can efficiently send them to the HMI on each poll.

“DataHub effectively decouples the OPC server from the client,” adds Munoz. “All the load is on DataHub’s shoulders now, and the performance is much better.” Top Server is now free to optimize communications to the device, while DataHub protects it from device reads. This has relieved Mukhaizna's users from having to redesign their HMI and PLC configurations, saving tens of thousands of dollars in engineering and development work.
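The "asynchronous advise" pattern that rescued the Mukhaizna system — keep every tag's latest value in memory, and forward a tag only when its value actually changes — is simple to model. This is a minimal sketch of the idea, not Cogent DataHub's implementation:

```python
class ChangeCache:
    """Toy model of a memory-resident tag cache that reports by
    exception: emit a tag only when its value changes, while keeping
    a full current image available for HMI polls."""

    def __init__(self):
        self.latest = {}

    def update(self, tag, value):
        """Return (tag, value) if the value changed, else None."""
        if self.latest.get(tag) != value:
            self.latest[tag] = value
            return (tag, value)
        return None

    def snapshot(self):
        """Full current image of all tags, ready to serve on a poll."""
        return dict(self.latest)
```

With this arrangement, a 30,000-tag scan in which only a handful of values moved produces a handful of change events instead of 30,000 reads — which is why the load on the OPC server collapsed.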

Teresa Benson, product marketing manager at Red Lion, explains, "We continue to see two key considerations—simplicity and security—driving decisions in the field-to-enterprise area. While some sensor and instrument manufacturers offer 'IoT-enabled' technology (for example, cables with built-in power sensors and network connectivity, or small Raspberry Pi-like cellular devices bolted on to sensors), most users still employ a simpler aggregate-and-act architecture. Even in applications where we see LPWAN/WPAN sensor networks, ultimately there are one or more gateway devices that act as converter, aggregator and occasionally actor, though they usually send data upstream (if even just upstream in a plant to an edge device) for decision and action. This aggregation architecture also gives customers a lower cost of entry into IoT, as they're usually able to start with existing sensors versus requiring new and potentially more costly sensing equipment or instrumentation."

"While the choice of backhaul medium (such as cellular versus wired Ethernet) is usually driven by geography of deployment, we’re increasingly seeing it as a choice about network security.  Sometimes, especially for OEMs who are deploying machines with a ‘remote service’ feature or built in recurring revenue stream, manufacturers are building cellular communications into equipment that may be installed on premise. The most common reason we've found is that they want to offer connected services such as remote diagnostics and maintenance. Offering cellular connectivity, and therefore a completely separate, owned and managed network connection, is a way to reduce concerns of their users' IT teams about having to provide and maintain a secure, accessible node on the network for a third party."

IIoT and MQTT

No doubt the main draw of bypassing traditional controls is that it simplifies the trip from the field up to the enterprise and its cloud-computing services and Industrial Internet of Things (IIoT) participants. There are several ways to do this job, but one that appears to be gaining the most ground lately is Message Queue Telemetry Transport (MQTT), a publish-subscribe, machine-to-machine messaging protocol that sits on top of TCP/IP.

"Traditionally, a HART transmitter would bring one value from a process device such as a Magnetrol level instrument, and send it via 4-20 mA to a PLC using a poll-and-response protocol," says Arlen Nipper, president of Cirrus Link Solutions and co-inventor of MQTT. "Usually, there are many applications and devices mish-mashed together, and they eat bandwidth, aren't scalable for adding applications, and are difficult to maintain. We think a new architecture is needed that decouples applications from their devices, and uses MQTT as a broker that can publish data from the application side, and let users on the subscriber side use it as needed.

"MQTT lets us eliminate the PLC by using hybrid poll-response or publish-subscribe protocol to free us from traditional poll-and-response. When these devices plug in, their tags are explained, they don't need configuration, they immediately begin storing history, and they let users easily make screens and alarms."  

Travis Cox, co-director of sales engineering at Inductive, adds that, "Sensors and field devices usually send one or two values to their PLC, but there can be hundreds of others that don't go to the controllers. In the case of Magnetrol's level probe, the question is what happens when you have a problem, and the level indication isn't enough for troubleshooting? Usually, it means a technician has to go and physically read the 1x4 in. LED screen on the device, while a lot of other data gets stranded. As a result, we worked with Magnetrol, and added a little Blue Tech PC with an ARM processor, HART modem and cellular connection to read the instrument's 480 local data points, and publish them to the cloud. Sometimes you need to skip the PLC because some data can't get to it, but it's still important for troubleshooting. Plus, many users can't put a PLC in a certain location because it's just too expensive."

To bring MQTT down closer to operations, Inductive recently introduced its Ignition Edge Panel software to create local HMIs for field devices with local and remote web clients, Ignition Edge Enterprise that synchs data from edge devices to a centralized server, and Ignition Edge MQTT that turns field devices into MQTT-enabled, edge-of-network gateways.

"MQTT is the next big development for transferring process data from sensors and field instruments to PLCs, DCSs and higher-level MES computing, Internet and cloud-based solutions," says Aldo Ferrante, president of ITG Technologies, a CSIA-member system integrator in Jacksonville, Fla., which recently developed its own cloud-based, IoT platform called Smart Operational Real-time, Big data Analytics (SORBA). "An MQTT broker manages data between devices and can handle small amounts to very large blocks of data making it very versatile and provide an industry standard between multiple systems.

"MQTT offer enterprise-level security and data encryption to protect the integrity of the system. And, as sensors become smarter and self-contained, decisions don't need to rely on a master control platform. Instead, embedded logic allows devices to control by exception, reducing the need for lightning-speed, isolated networks. Lightweight, embedded processors with MQTT can enable and extended the ability to distribute sensors over a vast geographical area."

The hardware way

While it can seem like software is doing all the work and having all the fun of simplifying networks and circumventing controls, there are actually several different hardware-based approaches that can achieve the same goals—even if they rely on software, too.

For instance, Opto 22 recently released what it reports is an industry-first representational state transfer (RESTful) application programming interface (API) and server for its programmable automation controllers (PAC), which will accelerate adoption, rollout and ROI of IIoT applications by flattening IIoT architectures, reducing complexity and eliminating middleware. Through this new RESTful API, developers gain secure programmatic access to new or legacy physical assets through control variables and I/O data using any programming language that supports JavaScript object notation (JSON). Available through a free updated firmware release for Opto 22's Snap PACs, its RESTful API includes an HTTP/S server accessible from any HTTP/S-compatible client.

"Putting RESTful API right down in the PAC means data requests are open and documented, and just reference the API's documentation, so it's clear what data to pull," says Benson Hougland, Opto 22's vice president marketing and product strategy. "This means data requests don't need to set a database ahead of time or give a tag list. This is a much more fluid method for moving and sharing data among entities in an simple, open and understandable way. The industrial automation and control industry is in transition right now. Vendors that have relied on a product development strategy based on proprietary and closed technologies have become outdated. The future of the industrial automation and process control industries lies in the rising API and data economies made possible through open standards-based technologies."

[sidebar id =6]

Philip Marshall, CEO of Hilscher North America, adds that, "Users want stable systems that are robust against attacks, but no one wants to patch their PLCs three to five times per week. This is where edge devices on the front line can really help. Because edge devices aren't doing critical control functions, they're easier to secure and patch if necessary. Hilscher’s netIOT Edge gateway with our netX communication chip inside can be air-gapped for read access only from IT/cloud applications. This prevents external intrusions, as data can't be written from the cloud to the gateway. In passive mode, the Edge gateway 'listens-to-all' on the existing control network. Its main activity is aggregating mostly real-time Ethernet and IIoT data, such as OPC UA and MQTT, and configuring that data to formats that can be used by management systems, such as those from IBM, Microsoft and SAP (Figure 2)."

[sidebar id =2]

Marshall adds that two netX microprocessors have been added to Hilscher's existing family of five. The two new netX chips enable OPC-UA and MQTT communications, and include security features such as secure boot and encryption.

"There's a lot of benefits to bypassing PLCs and/or working with PLCs to bring only the data that's needed to the cloud," explained Marshall. "If you go right from the PLC to the cloud there can be added security risks for operations. We're not disrespecting PLCs. It's just that traditional control setups that you don't touch for 20 years and the mentality that goes with them don't work with the Internet. If you're going to work with the cloud, you need to use dynamic systems that are hardened against attack and patched regularly like PCs, which don't need to resolve requested updates with traditional controller software. We're not out to replace PLCs. These edge devices are adjunct that just take IIoT-related data to the cloud, and don't disturb or impact the process control that's going on."

Likewise, coming from the hardware side, Advantech B+B SmartWorx reports its Wzzard hardware and protocols, including MQTT, work together to reduce the expertise and time needed to build scalable IoT connections. These devices include Wzzard intelligent edge nodes that connect to industry-standard sensors; SmartMesh IP wireless sensing technology that enables auto-forming, self-healing, self-sustaining networks that are also highly scalable; and Swarm 341 Gateway that connects equipment and devices to the Internet or Intranets over wired Ethernet or wireless cellular connections.

"The sensor nodes pick up auxiliary signals from the process or automation system, and reports them to the gateway, which can have aggregation points such as smart metering," explains Paul Kutch, IIoT solutions sales director at Advantech B+B SmartWorx. "The aggregation points let us add intelligence at the edge, enrich data, pick reports by exception, and reduce data consumption. For 40 years, we've been connecting devices to applications, but we need to think more about connecting devices infrastructure, and then plugging in applications. By using MQTT, Wzzard can publish temperature sensor data, for example, and 100 other devices can subscribe and use it as needed. We have to drop these monolithic, poll-and-response hosts that bottleneck our data."      

In addition, Turck Inc. recently introduced its field logic controllers (FLC), which bring simple logic programming down to the device level via IP-rated Ethernet I/O blocks with built-in FLC technology, and can run standalone without any PLC, or cooperate with or back up PLCs. FLC technology uses a flowchart system to custom program the local Ethernet I/O blocks via an HTML5-compatible web browser. Through a series of drop-down menus, engineers can set up multiple conditions, operations and actions on one block. FLC also allows users to create high-density I/O without a PLC.

Similarly, Littelfuse Inc. recently launched its MP8000/MP8100 Series Bluetooth-enabled motor protection relay that allows maintenance personnel to communicate with it from up to 30 feet away using an app on a tablet PC or iPhone or Android smartphone. Once a smartphone is securely paired with MP8000/MP8100, users can easily monitor system status in real time, set up the relay, adjust its settings, and review its fault history. The app is intuitive and requires no training to use.

Varun Nagaraj, CEO of Sierra Monitor Corp., adds that, "Innovation in these technologies is all about novelty and value creation. For example, Digi International provides its model-oriented sensor cloud, while Monnit Corp. does all kinds of remote monitoring, and Samsara manufactures Internet-connected sensor systems. Even the voice-activated Amazon Echo is getting into building and facilities management, and could probably perform some monitoring or control tasks soon."

Wireless weighs in

Because Ethernet provides the physical path for data to skip controls and/or simplify networks and communications, it's no surprise that wireless can help, too.

Emerson's Boudreaux adds that users in operations are examining other common, non-critical functions that might be candidates for network simplification, including bypassing controls. "Heat exchangers have instruments and controls that usually go to the DCS to control temperature, but they also have pressure signals that don't go to the DCS, and these can be used to check for build-up and indicate the need for maintenance," explains Boudreaux. "Now we can replace a mechanical pressure gauge with a wireless one, and integrate measurements with heat exchanger monitoring software, so engineers can assess heat exchanger performance without opening it and with more certainty about what needs to be done. And, combining pressure sensing data with temperature and flow means we can also do thermodynamic evaluations on heat exchanger efficiency that are more holistic.

"Control is possible in some limited use cases using wireless measurements, but the biggest use case now is collecting data and monitoring applications outside of control, and finding areas that aren't solved, such as heat exchangers, pump reliability, steam trap maintenance and others in the future. My thought is the opportunity isn't so much bypassing controls, but rather putting measurements into monitoring networks where they make sense. We wouldn't wire a control signal into a safety system unless it was required for safety, so why should we wire a monitoring signal into a control system if it isn't required for control?"

McEnery adds that wireless can even deliver signals directly to cloud-computing platforms and other server-based applications. For example, McEnery Automation recently took over integrating and maintaining remote terminal units (RTU) in some of the municipal water/wastewater systems operated by American Water Enterprises in St. Louis. McEnery has worked with American Water for many years, but this project involved RTUs implemented by Mission Communications Inc., which automates small pumping applications with just a few I/O points for level indication and pump control. These RTUs use digital, cellular-connected radios to reach the Internet and feed data to Mission's server and cloud service, while an OPC-UA server gathers data from all the locations in a customer's facility and sends it to a Schneider Electric Wonderware SCADA system in its control room.

"RTUs have microcontrollers, so their pumps can run on their own and collect data, but in this case, their information skips the usual PLC or DCS, and goes right to the cellular connection, Internet and server, which sends it back to the customer's SCADA/HMI via the OPC connection and Wonderware Historian," says McEnery. "This is a very cost effective approach when having to communicate with many small remote devices."

Sustaining security

One of the richest ironies created by doing an end run on typical control infrastructures is that PLCs and DCSs can sometimes do the same—skip their own infrastructure, get on the Internet, and provide information without going through their usual channels. However, whether data goes around the PLC or DCS or through it, security remains a top concern for all users, integrators and suppliers.

"Probably the largest three arguments about using existing equipment versus bypassing it is security, access to relevant data, and impacts to current equipment/processes," says Red Lion's  Geis. "Using existing equipment to access data can be an easy integration point, but how secure is the device (PLC, RTU, drive, etc.)? PLCs and most automation equipment were never designed to be deployed on a publicly accessible network and lack effective security from external threats. Integrating existing equipment to a secure IIoT gateway can provide security, but also may open up the control network if not properly configured.

"Does the device have access to all the data or sensors necessary to allow for a systematic approach to process improvement? Existing equipment was deployed for an application, e.g. control, monitoring, maintenance, etc. As such, the data collected from that equipment is likely to be application-specific. IIoT is about breaking that paradigm, and allowing data to be accessible to any application that requires it. What customers are finding is that while using existing equipment is an easy first step in the process, unfortunately this method doesn’t often provide enough data for a clear picture. Finally, customers need to evaluate the impact on the current process for installation of the new data collection system. Does the system/process need to be shut down for the installation of new sensors or to tap into existing sensors? Do new PLCs, RTUs, DCSs, I/O, Ethernet, and power need to be run to the new locations?"

ITG's Ferrante adds, "The benefits of bypassing PLCs and DCSs are complete autonomy and flexibility. Building complex, multi-dimensional structures and control algorithms becomes simpler, which makes the software easier to develop and the data easier to collect and analyze. There's no longer a need to build data-handling software in PLCs and DCSs just to get data to its final endpoint. Mobility is another big benefit because you can analyze sensors and instruments from your mobile device.

"However, security is always a concern, even though the same secure layer is implemented using MQTT, and extra caution and design consideration need to be reviewed. At first, troubleshooting can be slightly more challenging because there's no direct path to a device. Also, trying to diagnose the origin of sensors belonging to devices requires better tools and methods to determine root causes. These tools include better analytics and learning algorithms to model multi-dimensional problems."

Cloud reaches down

Just as field devices and plant applications are seeking simpler paths up to the enterprise, some cloud services and IIoT platforms are exploring top-down ways to access data from those components that may not include the usual control infrastructure.

For example, Honeywell Process Solutions (HPS) recently launched its Connected Plant program, which combines the domain knowledge of its UOP division with its own partner network and other capabilities. Paul Bonner, vice president of consulting and data analytics at HPS, reports that Honeywell Connected Plant improves users' production efficiency and process reliability and optimizes their supply chains by augmenting their process knowledge with OEM equipment analytics. The program also relies on Honeywell's Sentience Cloud, which is based on the Microsoft Azure cloud-computing service. Field and plant devices deliver data to IIoT components that relay it wirelessly to Sentience Cloud, which distributes it to analytics, modeling, asset management and other software tools, and to partners in Honeywell's INspire joint innovation program. All of this is done without disturbing existing DCS and MES infrastructures, which can also report to Sentience Cloud via a firewall (Figure 3).

"Analyzing plant performance with a cloud-based service enables round-the-clock monitoring of plant data and rigorous simulations; provides ongoing, operational recommendations to close performance gaps; and employs UOP's process models and experience in operational support and troubleshooting," says Bonner. "The customer value of this is 30-50 cents per barrel in refining and $10-20 per metric ton in petrochemicals."

ITG's Ferrante adds, "The development of alternate paths will evolve to other alternate networks, such as mesh-canopy networks, Wi-Fi and cellular networks. In addition to sensor and control data, video and textual data will be combined to build even more complex models to decrease downtime and improve operations. The concept of smart virtual operators will replace the traditional operator as we know it, reducing the manual labor and staffing required to operate manufacturing facilities."

About the Author

Jim Montague | Executive Editor

Jim Montague is executive editor of Control.