It always helps when someone else scouts ahead, cuts through the jungle first, and sends back guidance about sought-after destinations and hazards to avoid. And, just as geographical explorers assist travelers who come later, computing and networking pioneers can research, develop and experiment with new and emerging digital technologies to help those not as far up the learning curve. This is especially true for information technology (IT) experts, whose know-how is becoming more crucial to users in process control/automation and other operations technology (OT) disciplines as their industries are increasingly swept up by the Internet and its offshoots, the Internet of Things (IoT) and Industrial IoT (IIoT).
Buzzwords aside, as soon as software, microprocessors and wired/wireless Ethernet touched the plant floor, it was inevitable the Internet would support and take over from fieldbuses and their protocols—just as they previously supported and took over from hardwiring. The only question now is how to make it simple, effective and secure in process applications, as well as in the HMI/SCADA, MES, ERP, cloud-computing and other functions that serve them.
Lower hurdles, fewer bumps
"We began by implementing our building automation system (BAS) about 10-15 years ago for a multi-site pharmaceutical application with a standard automation approach using PLCs, thick clients and servers. Eventually, we moved to progressively smaller PLCs and ARM-based embedded industrial PCs with a private cloud, and more recently expanded to public cloud-hosted servers using IIoT sensors and devices," says Chris Hamilton, industrial information technology/operations technology (IT/OT) director at Grantek Systems Integration, a CSIA-certified system integrator (SI) in Burlington, Ontario, Canada. "In the past, this required a massive, costly BAS, along with a large investment in infrastructure, which wasn't viable for medium-sized or remote facilities that were just as critical to our customers."
Hamilton adds that Grantek helped its longtime pharmaceutical customers extend the range of their temperature monitoring, ruggedize their equipment, and implement other BAS services with sensors and transmitters over the past four or five years. Previously, its customers were sending data to PLCs, but now, by leveraging its ARM-based embedded industrial PCs, Grantek can also aggregate data from other sources, such as Modbus and DNP3 devices, and leverage Amazon Web Services (AWS) cloud-computing services, including EC2 and RDS, for analytics and historization. "The customer needed low-cost monitoring of all their sites at a central location and on top of their engineering architecture," explains Hamilton. "By leveraging industrial protocols like EtherNet/IP, and building on our experience in the industrial space, we could implement a more viable system built on low-cost embedded devices." Grantek has continued to evolve its offering by adding plant-floor dashboards, powered again by ARM-based embedded PCs, delivering real-time, actionable data on which customers have come to rely.
Following its BAS system's success at Internet-based sensor monitoring, Hamilton reports Grantek expanded its use in power generation, solar and data center monitoring. "Due to their low price point, these simple devices together cost hundreds of dollars," adds Hamilton. "Previously, controllers used for these tasks would cost thousands of dollars or more."
Dave Emerson, vice president of the U.S. Technology Center at Yokogawa Corp. of America, adds that: "Early adopter end users are taking virtualization, cloud computing, IIoT and mobile computing into account when designing process applications to increase efficiencies, such as streamlining access to data and simplifying work processes. Virtualization reduces hardware requirements, delivers greater flexibility about where to run software, and reduces obsolescence risks tied to hardware or old operating systems. Cloud computing can deliver faster deployment while reducing deployment costs, and it enables access to the same applications and data sets among sites. IIoT enables much greater variety and volume of data to be brought into the corporate data pool at a lower cost by integrating new sensors and measurements into the network. And mobile computing enables easier access to data, timely alerts and collaboration tools.
"An example of these technologies is Yokogawa's cloud-based engineering environment, which is deployed globally. By locating the engineering test system in the cloud, any authorized Yokogawa engineer or customer can access it at anytime from anywhere. It gives us vastly increased flexibility for scheduling engineering resources during project execution. A controller emulator function is implemented as a fundamental part of the Yokogawa engineering environment. This is applied to cloud-enabled engineering, allowing engineers to seamlessly start remote testing on the cloud once the system configuration is completed."
Simpler, sharper interfaces
Probably the main way that common IIoT and web-enabled methods benefit users is by simplifying network pathways and access, allowing more consistent and detailed HMI displays of their data, which in turn enable faster, better decisions.
Figure 1: Chobani uses web-based, unlimited-licensing Ignition SCADA software on filling and packaging lines at its three Greek yogurt plants; for asset management, ERP and capex planning; to give all staff access to plant floor-to-enterprise data to improve efficiency; and to enable IT and OT convergence. Source: Chobani and Inductive Automation
For instance, in late 2017, Chobani started the third expansion of its world's largest yogurt plant in Twin Falls, Idaho, which is adding a $20 million R&D center to its 1.4-million-square-foot facility. Much of Chobani's meteoric rise is due to the public's rediscovery of Greek yogurt, but it's also because the company is a long-time user of web-based Ignition SCADA software from Inductive Automation, which it uses at all three of its plants on filling and packaging lines, and for quality control, asset management, enterprise resource planning (ERP) and capital expenditure (capex) project management. Hugh Roddy, global engineering and project management VP at Chobani, reports Ignition lets staff access data they never had before, view it from the plant floor to the executive level, and share it to improve efficiency and reduce downtime (Figure 1).
“Once we took Ignition onboard as one of our enterprise platforms, everything improved exponentially across the board from an operational standpoint,” says Roddy. "Having data from Ignition at their fingertips helps our employees be more efficient, and it makes them feel part of the team. Ignition also lets us integrate the convergence of our operational technology (OT) and information technology (IT) environments into one platform to create more efficiencies."
J.C. Givens, global network services manager at Chobani, adds: “Ignition has been a good bridge for OT/IT collaboration. We’ve been able to make gateways available to both networks, so whether people are in the office making decisions or on the plant floor making decisions, IT and OT information are both available.”
In addition, Chobani reports Ignition's unlimited licensing arrangement gives it the flexibility to quickly keep up with production increases by rolling out as many devices and clients as it wants in as many places as it needs, or set up a single HMI for a special requirement, which happened in its new facility. “Previously, operators used a radio to call in, and had someone start each process step for them,” says Trevor Bell, automation engineer at Chobani. “With Ignition, we can put a special HMI out there, just for them. Ignition makes it cost-effective to do a one-off scenario like that.”
Standards start to solidify
One way to get new technologies and their users on the same page is to try to agree on standards for applying them, or if they have enough momentum, formalize the de facto standards that grow up with them. Though there aren't really any completely settled, formal IIoT standards yet because the technology expanded so fast, there are a number of long-time Internet standards from the IT side that can be useful.
Read more: Get into the IIoT mindset
"Process applications are being impacted by IIoT and mobile computing in rapidly expanding ways. The primary gains being experienced involve data democratization, which leads to data sharing, storage, analysis and better decision making in near real-time," says Benson Hougland, vice president of marketing and product strategy at Opto 22. "De facto standards are fine, but international standards defined and accepted across many industries are better."
Hougland reports one of the IIoT-enabling standards is message queuing telemetry transport (MQTT), which is already established as ISO/IEC PRF 20922. "MQTT has emerged as an alternative to traditional poll-response mechanisms for getting data out of operational systems by using a publish-subscribe model," he explains. "The advantages of the MQTT publish-subscribe (pub-sub) model are many, but the key issues addressed are security and performance. Pub-sub models like MQTT don’t require inbound network interface ports on OT devices (like PLCs) to be opened, which mitigates a major security risk. Further, MQTT methods publish data only on change, while maintaining a connection at all times for state management and bidirectional data transfer and control."
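The publish-on-change behavior Hougland describes is often called report by exception. A minimal Python sketch of the idea is below; the topic name and deadband are illustrative, and the `published` list simply stands in for a live MQTT broker connection (a real deployment would use a client library such as paho-mqtt):

```python
# Sketch of MQTT-style report-by-exception: a value is "published" only
# when it changes beyond a deadband, rather than being polled on a
# fixed cycle. Topic names here are illustrative.

class ChangePublisher:
    def __init__(self, topic, deadband=0.0):
        self.topic = topic
        self.deadband = deadband
        self.last = None
        self.published = []   # stands in for an MQTT broker connection

    def update(self, value):
        """Publish only if the value moved more than the deadband."""
        if self.last is None or abs(value - self.last) > self.deadband:
            self.published.append((self.topic, value))
            self.last = value

sensor = ChangePublisher("plant1/boiler/temp", deadband=0.5)
for reading in [72.0, 72.1, 72.2, 74.0, 74.1, 71.0]:
    sensor.update(reading)
print(sensor.published)  # only the significant changes go out
```

Because only three of the six readings clear the deadband, only three messages leave the device, which is where the bandwidth and performance gains of the pub-sub model come from.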
Another cornerstone standard that's simplified the IIoT landscape is the long-established and well-known transmission control protocol/Internet protocol (TCP/IP), or Internet protocol suite. It's managed by the Internet Engineering Task Force, which handles IP as RFC standard 791, and TCP as RFC standard 793. In fact, MQTT and a host of other mostly four-letter protocols are organized as part of the suite's application, transport, Internet and link layers.
"TCP/IP isn’t usually considered an emerging standard as it’s used by nearly every networked computer, mobile device and most PLCs and PACs in existence today," adds Hougland. "However, the TCP/IP protocol suite is the grandfather of all networking standards in use today."
Yokogawa's Emerson adds: "MQTT is frequently used in new IIoT applications, products and services. It's low cost, has low overhead and is fast to implement. However, as users require more sophisticated results and applications, they'll need an overarching information model that relates all the low-level MQTT values into a contextual model that various technologies, including advanced ones like AI can utilize. When this maturity level is reached, users need to move from MQTT to OPC UA, which provides the more sophisticated information modeling required for advanced applications. OPC UA can run on top of MQTT, so they can be used together. OPC UA can also run on top of other protocols, allowing multiple protocols to be used within an enterprise."
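The gap Emerson points to can be seen in the payloads themselves. A bare MQTT publish might carry nothing but a number; an information model, such as the ones OPC UA defines, attaches the context that lets other applications interpret it. The JSON field names below are illustrative, not OPC UA's actual encoding:

```python
import json

# A bare MQTT payload carries only a value; a contextual payload adds
# the metadata an information model would supply. Field names and the
# tag path are illustrative, not from the OPC UA specification.

bare_payload = "73.4"  # what a minimal MQTT publish might carry

contextual_payload = json.dumps({
    "value": 73.4,
    "units": "degF",
    "source": "plant1/boiler/temp",        # hypothetical tag path
    "timestamp": "2019-04-01T12:00:00Z",
    "quality": "good",
})

decoded = json.loads(contextual_payload)
print(decoded["source"], decoded["value"], decoded["units"])
```

An analytics or AI application receiving the first payload has to guess what the number means; one receiving the second can act on it, which is the maturity step Emerson describes.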
Caution and care still required
Despite its many advantages and increasing reliability, IIoT is still used mostly for monitoring and non-critical control because its communications and networking can still suffer interruptions and outages, especially when it uses mainstream Internet and public infrastructures.
"Potential users need to define their value proposition to decide if they can use non-standard or non-industry-norm technologies," says Grantek's Hamilton. "For instance, we wouldn't put a consumer-grade device like a Raspberry Pi on a critical batch control system. However, if we're doing visualization or other non-critical functions, then we can be more creative. It's also important to work with a vendor with a mature team that can use the right software development process, including version control and unit testing. This is important because these systems need to continue to evolve and grow to support our customers' ever-changing needs. A traditional control system is a beast that doesn't change as easily."
Because Modbus, DNP3 and other fieldbuses perform well on the plant-floor but need an IP protocol for IIoT, and because MQTT is a transport protocol that efficiently moves data but doesn't natively store-and-forward it like plant-floor networks do, Hamilton adds that some kind of protocol conversion is usually needed to get data from one realm to the other. "For example, it's a bad idea to try and do OPC UA directly over the Internet because it isn't a hardened WAN protocol, and it isn't designed to handle Internet latency and security requirements. So, when we're going from OPC UA to a Raspberry Pi, we need to use protocol conversion such as Ignition Edge or Kepware, or run some custom software on an embedded device. For users to be happy with IIoT, they must understand the landscape of how their devices are pulled together on these networks."
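The store-and-forward behavior Hamilton notes is missing from plain MQTT is typically supplied by the edge gateway: it buffers samples while the WAN link is down and flushes them in order once it returns. A minimal sketch of that pattern, with a list standing in for the cloud endpoint:

```python
from collections import deque

# Sketch of edge store-and-forward: samples queue locally during a WAN
# outage and flush in order when the link returns. The "sent" list
# stands in for a cloud endpoint; a real gateway product (e.g. Ignition
# Edge, as mentioned above) handles this internally.

class EdgeBuffer:
    def __init__(self, maxlen=1000):
        self.queue = deque(maxlen=maxlen)  # bounded so RAM can't overflow
        self.sent = []
        self.link_up = True

    def record(self, sample):
        self.queue.append(sample)
        if self.link_up:
            self.flush()

    def flush(self):
        while self.queue:
            self.sent.append(self.queue.popleft())

gw = EdgeBuffer()
gw.record(("temp", 72.0))
gw.link_up = False          # WAN outage begins
gw.record(("temp", 73.1))
gw.record(("temp", 73.5))
gw.link_up = True           # link restored
gw.flush()
print(gw.sent)              # all three samples arrive, in order
```

The bounded queue is a deliberate choice: on an embedded device, an unbounded buffer during a long outage would eventually exhaust memory, so old samples are dropped first.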
Richard Beeson, CTO at OSIsoft, adds: "One concern with IIoT and the way it's evolving is that an application can still end up with data silos when they're seeking process optimization and management. IIoT may solve one problem, but if you're trying to look at a whole process operation, then you may still need a bigger picture using plant and cloud-based data together. For instance, we recently worked with Petasense and used a Fitbit-type sensor with digital signal processor to do vibration monitoring on rotating equipment, and send its data to the cloud for analysis and optimization. Plus, if different patterns were observed, we could begin to determine how it related to upstream performance and materials, and whether they could be detrimental to the overall process."
Beeson adds that OSIsoft has bridged diverse systems across plants and applications for many years to give users higher-order perspectives, but this landscape began to collapse as OPC UA allowed many to talk the same language and as IIoT emerged and exploded. "What we're finding is even more diversification and new silos, and users that want to bridge all of them with IIoT," adds Beeson. "This can be done with the traditional OSIsoft approach, in which each IoT stack serves as a data concentration point for sensors, and efficiently makes SCADA or DCS information available. We're continuing to do that, but we recently kicked off a new project, Linux Foundation (LF) Edge, which pulls together multiple open-source efforts using edge-of-network technologies. Instead of allowing network silos to persist, LF uses a hub gateway at the edge to get source devices to agree on common mechanisms for configuration, create a common middle playground, and allow anyone to talk via IIoT to any vendor's cloud, sensors and other devices."
Beeson adds that publish-subscribe protocols like MQTT and AMQP can be helpful, but they only "ship buckets of bits," so users also need to establish context for their data to use it. "In the process control world, Sparkplug MQTT is getting lots of uptake because it's open-source and doesn't require users to build their own MQTT application, but users still need a translation layer to understand their data. For example, we know how to format data on one side and interpret it on the other side using our OSIsoft Message Format (OMF), which uses a simple construction to encode sensor information or time-series data (TSD) and events, and add context later in the same way that digital twins are defined."
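The pattern Beeson describes can be sketched loosely: define the shape of the data once, then ship compact time-series messages that reference it, with asset context layered on later. The structure below mimics OMF's separation of type definitions from data messages, but it is a simplified illustration, not the actual OMF specification:

```python
import json

# Loose sketch of defining data shape once, then shipping samples that
# reference it. The type id and property names are hypothetical; the
# real OMF spec defines type, container and data messages precisely.

type_message = {
    "id": "VibrationReading",            # hypothetical type id
    "properties": {
        "timestamp": {"type": "string", "format": "date-time"},
        "rms_velocity": {"type": "number"},
    },
}

data_message = {
    "typeid": "VibrationReading",
    "values": [
        {"timestamp": "2019-04-01T12:00:00Z", "rms_velocity": 2.3},
        {"timestamp": "2019-04-01T12:00:01Z", "rms_velocity": 2.4},
    ],
}

payload = json.dumps([type_message, data_message])
messages = json.loads(payload)
print(len(messages), "messages: one shape definition, one batch of samples")
```

Because the receiver already holds the type definition, every subsequent data message stays small, and the mapping to a digital twin or asset hierarchy can be added on the server side without touching the sensor.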
IP streamlines operations, too
When carefully applied, another advantage of IIoT and its common, IP-based networking methods is they can be simpler and faster to deploy than traditional fieldbuses and the data translation, configuration and other tasks they're assigned.
Figure 2: To gather data from controls at remote power plants, Siemens Buenos Aires replicates data from a T3000 DCS to an interface PC and Siemens WinCC OA SCADA system and server; uses OPC to relay data because T3000 has an OPC server and WinCC has an OPC client; and tunnels OPC data over TCP with a company VPN and selected Cogent DataHub tunneling software from Skkynet. Source: Siemens and Skkynet
For example, Alexis Tricco at Siemens Buenos Aires in Argentina recently undertook his firm's first digitalization project as part of its overall digitalization program. To provide technical support backup for power plants' generating operations, Tricco and his colleagues typically supervise and upgrade client operations, but the digitalization project assigned them to develop a reliable, secure way to collect data from controls at power plants located hundreds of kilometers (km) from their office. The first phase was a pilot connecting Tricco's WinCC OA SCADA system to a Siemens T3000 DCS running at a power plant about 100 km from Buenos Aires, spanning the plant's control network and a multi-customer network (Figure 2).
“My idea was to bring all the process data onto my WinCC OA server running on the customer network,” says Tricco. “To get this, I needed to replicate the data from the T3000 to the interface PC and from there to the WinCC OA server.” Tricco chose OPC to relay plant data because the T3000 had an OPC server and WinCC has an OPC client. However, since OPC DA doesn't network well, he decided to tunnel OPC data over TCP with a company VPN, and selected Cogent DataHub tunneling software from Skkynet.
“I needed to communicate over different networks, with end points that could convert between TCP and OPC, acting as server and client simultaneously,” explains Tricco. “DataHub has an OPC server on one side and an OPC client on the other, which is what I needed. Other software would've required two licenses for each PC, and I had to think of the costs. DataHub was user-friendly and wasn't complicated, and I got it working in less than a day."
Following the pilot's success and the client's acceptance, Tricco can go online from his WinCC OA server in Buenos Aires, collect OPC data from the plant's T3000, and perform real-time analysis. The plant's engineers can also monitor the performance of their gas turbines, and optimize combustion and control emissions to meet regulations without going onsite. Thanks to digitalization, the plant is running at higher capacity with reduced emissions, and the client plans to implement it at two more plants. “In fact, just sitting at home, I can connect to our VPN and customize processes in a couple of hours," adds Tricco. "With data from the customer, we can choose which equipment to monitor and decide if we need to further optimize performance.”
Similarly, Cemex established its Cemex Energía subsidiary five years ago to reduce its electricity costs and CO2 emissions by using renewable energy sources such as wind to generate power for cement production. The division builds power plants and provides asset management services to other companies, and is working on more than 20 projects worldwide. Cemex Energía also owns and operates three power plants in Mexico, two wind and one geothermal, which have 1 gigawatt (GW) of combined capacity. The two wind farms each supply more than 250 megawatts of electricity and are critical to Cemex's cement production, but they also need data for efficient operations.
“The performance of the wind turbine generators directly impacts our main KPIs—technical, contractual and economic—so our challenge is maximizing the resources available from the wind farm,” says Roberto Carlos Medrano, operations and maintenance manager at Cemex Energía. “We also needed data management to support our energy operational platform in all Cemex Energía operations to enable a reliable asset management strategy to comply with all our contractual obligations.”
Consequently, Cemex Energía adopted OSIsoft's PI System, Web API, Connected Services and OPC data communications. "PI and Connected Services create a specialized and reliable analysis tool that measures deviations in real performance compared to warranty performance behavior in real-time,” says Medrano, who adds the wind farms use PI Server's four main features—Asset Framework (AF), Event Frames, notifications and high availability—to improve turbine operations by implementing simpler operations presentation on a few simple screens.
For example, Cemex Energía employs PI System combined with a Google Maps image to produce a live-action overview of the wind farms, which shows the running status of the turbines in real-time, any faults, and operations, including daily and monthly production, ambient temperature, capacity and wind speed. Operators can double click on the graphic for individual turbines to view details; use AF navigation to move between turbines; view an alarm panel page; and drill down to circuit breaker details (Figure 3).
Figure 3: Cemex Energia is using wind power to generate electricity for making cement with help from OSIsoft's PI System, Web API, Connected Services and OPC data communications. PI even combines with a Google Maps image to produce a live-action overview of Cemex's wind farm, showing the status and operations of turbines in real time. Source: Cemex and OSIsoft
“Without AF I think this could not be possible,” adds Medrano. "We have a lot of turbines, which means we need the capabilities of AF and its templates, so we can create one template, and spread it among all the turbines."
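The templating idea Medrano credits to AF—define one asset template, then stamp out an instance per turbine—can be illustrated generically. This sketch is plain Python, not the PI Asset Framework API, and the attribute and notification names are assumptions based on the metrics described above:

```python
# Illustration of the asset-template pattern: one template, many
# turbine instances. Names are illustrative, not from the PI AF SDK.

TURBINE_TEMPLATE = {
    "attributes": ["wind_speed", "power_output", "ambient_temp"],
    "notifications": ["trip", "comm_loss", "over_temperature"],
}

def make_turbine(name, template=TURBINE_TEMPLATE):
    """Stamp out one turbine asset from the shared template."""
    return {
        "name": name,
        "attributes": {attr: None for attr in template["attributes"]},
        "notifications": list(template["notifications"]),
    }

farm = [make_turbine(f"WTG-{n:02d}") for n in range(1, 4)]
print([t["name"] for t in farm])
```

The payoff is the one Medrano describes: add an attribute or notification to the template once, and every turbine in the farm picks it up, instead of editing dozens of assets by hand.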
In addition, Cemex Energía has also created detailed event and operational spreadsheet reports using PI Server's Event Frames and PI DataLink. The reports track key metrics for each turbine and help users analyze failures in context. “This lets us focus on the assets with high frequency of failures to immediately establish action plans with the operators,” adds Medrano.
To get an even clearer picture and further improve wind farm operations, Cemex Energía has also created notifications for turbine trips, circuit breaker operations, communications issues, wind sector management, ambient temperature above safe operation, reactive power, data freezes and power curve performance. These notifications show personnel not only what's happening in the turbine, but what actions need to be taken to correct issues. “The result is the avoidance of lost production due to inefficiencies in turbine performance,” he says. "We're counting every minute, every second about how the turbines are operating.”
In the future, Cemex Energía plans to integrate more electrical parameters from its cement plants into the PI System, so it can manage grid compliance, as well as evaluate PI System for future metering projects.
Wireless rides shotgun
Of course, once Ethernet began popping up in process applications, it was closely followed by related wireless formats such as WiFi and Bluetooth, which joined the radio, cellular and satellite technologies that were already widely deployed. And, because Ethernet paves the way for the Internet, IIoT can also expand via wireless.
Likewise, Kraft Heinz Co.'s plant in Champaign, Ill., has long used PLCs to manage operations and raw material/product refrigeration, but it didn't have enough sensors with real-time monitoring, notifications and alerts of out-of-range temperatures, and often resorted to paper chart recorders that had to be manually checked. As a result, Kraft Heinz engineers recently installed matchbox-sized, battery-powered wireless temperature sensors from Swift Sensors, which send data to a wireless bridge that relays it via secure Wi-Fi to a cloud service for uniform display on a common dashboard available on PCs and mobile devices.
Better temperature monitoring and improved productivity encouraged Kraft Heinz to add wireless vibration sensors to motors on its pumps and compressors, and provide real-time alerts to quality assurance and maintenance staff about changes in their performance without altering existing equipment. The latest wireless device deployed at the Champaign plant is Swift Sensors' 4-20 mA sensor, which connects to the existing 4-20 mA current loops running from the RTD sensors to the PLCs. It lets users pull data from the transducers and sensors already wired to those PLCs, delivering real-time updates from all sensors in the plant to one cloud-based dashboard viewable on PCs, tablets or smartphones (Figure 4).
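A 4-20 mA loop encodes a measurement linearly: 4 mA represents the bottom of the calibrated range, 20 mA the top, and a current well under 4 mA signals a broken loop or failed transmitter. A short sketch of that scaling, with an illustrative 0-100 °C RTD range:

```python
# Convert a 4-20 mA loop current to engineering units. The 0-100 degC
# range below is illustrative; the under-range cutoff (~3.8 mA) is a
# common convention for flagging a broken loop or dead transmitter.

def scale_4_20ma(current_ma, lo_eng, hi_eng):
    """Linear scaling: 4 mA -> lo_eng, 20 mA -> hi_eng."""
    if current_ma < 3.8:                  # under-range: loop fault
        raise ValueError("loop fault: current below 4 mA")
    return lo_eng + (current_ma - 4.0) * (hi_eng - lo_eng) / 16.0

print(scale_4_20ma(12.0, 0.0, 100.0))   # mid-scale current -> 50.0
print(scale_4_20ma(4.0, 0.0, 100.0))    # bottom of range -> 0.0
```

The live-zero at 4 mA is the reason the standard has endured: a reading of 0 mA is unambiguously a fault, not a valid measurement, which is exactly the kind of signal a cloud dashboard wants to surface as an alert.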
Figure 4: Wireless, 4-20mA sensors deployed at Kraft Heinz's plant in Champaign, Ill., are connected in a 4-20 mA current loop from RTD sensors and PLCs controlling operations, refrigeration and motors. They deliver data to a bridge and cloud service that displays it on a common dashboard, which users can view on PCs, tablets and smartphones. Source: Kraft Heinz and Swift Sensors
Kraft Heinz reports automatic data logging and reporting by its new wireless network reduced its return on investment (ROI) to just five months, while immediate notifications have aided its food safety and saved hundreds of thousands of dollars of raw materials and finished products.
Sam Cece, founder and CEO of Swift Sensors, reports that users can deploy its 50 wireless, IP-enabled sensors as needed, display their information on the same dashboard, and even integrate video from Eagle Eye security sensors. "IoT can help as users deploy more temperature, humidity, 4-20 mA, water presence, electricity, vibration and other sensors," he says. "But as they also generate more unstructured data, our interface can present it all in the same way, allow thresholds to be set for each, and overlay and combine their data for better insights. This replaces some former tribal knowledge with sensor data and thresholds shown on smartphones. This isn't talked about as IoT, but that's what's happening."
Seeking security
Despite its considerable and ongoing impact, one major snag holding the Internet back from being applied more widely in process applications and other industries is its perceived and often real lack of cybersecurity. And, even though perception and reality aren't the same, they still put the same restrictions on using IIoT to aid and optimize process applications.
"The primary potential risk in using IIoT and mobile technologies in process applications is addressing and implementing security. It’s very easy to quickly build systems without regard for security measures, and for this reason, many devices, controllers and systems aren't adequately protected from intruders or other bad actors in a networked system," says Opto 22's Hougland. "That said, plenty of security tools and architectures are freely available to implement and protect process systems. However, because it is a choice and not a requirement on the part of the architect of these systems (whether end-user, integrator, OEM or others), these security tools are often not implemented correctly, or in many cases, not implemented at all. This is due in part to the fact that security tools and architectures have principally been part of the IT domain, and not the OT domain. In many cases, this boils down to knowledge and training."
Because the classic process automation model is a PLC with I/O points reporting to it, Grantek's Hamilton adds that IIoT really starts where ISA-95 Level 0 devices become network connected and add intelligence otherwise not available, though they bring vulnerabilities that must be addressed. "The benefit of IIoT is that it can achieve much deeper visualization into process applications," says Hamilton. "When a basic temperature sensor adds a chip, intelligence and the Internet, users can get many more and different indications. The sensor can tell when it was last calibrated or if there are any anomalies, and it can trend data. Previously, only one piece of data was available because of its analog output. Now, devices with chips are much more flexible, but they're also vulnerable. Smart sensors can have their code modified to give bogus data or become part of denial-of-service (DoS) attacks, so users and their security teams need to work with security-aware system integrators and vendors to follow defense-in-depth, least-privilege and other security policies."
Read more: Primary IIoT players
Hamilton adds that every vendor wants to get onto the IIoT, but end-users usually only ask if they can get the functions they want, and how cheaply they can get them. "Users and vendors need to give themselves a little more space to follow security best practices," he says. "They must implement encryption and mutual authentication to make sure their devices are only used in the fashion intended and can access only what's needed to do it. This means employing PKI or PGP cryptography, so during sign in, you send me a message that I can use to validate you and vice-versa. IIoT devices are proliferating rapidly, but they must follow procedures that make them not just functional but secure as well."
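The mutual validation Hamilton describes—"you send me a message that I can use to validate you and vice versa"—can be sketched as a challenge-response handshake. The version below uses an HMAC over a pre-shared key to keep the example self-contained; production IIoT deployments would more typically use PKI certificates and mutual TLS, as Hamilton notes:

```python
import hashlib
import hmac
import secrets

# Sketch of challenge-response mutual authentication: each side proves
# it holds a shared secret by answering a random challenge with an
# HMAC. The key below is a hypothetical provisioning secret; real
# deployments would favor per-device certificates (mutual TLS).

SHARED_KEY = b"provisioned-device-secret"

def answer_challenge(challenge: bytes, key: bytes = SHARED_KEY) -> str:
    """Compute the HMAC-SHA256 response to a challenge."""
    return hmac.new(key, challenge, hashlib.sha256).hexdigest()

# Server validates the device...
challenge = secrets.token_bytes(16)
device_response = answer_challenge(challenge)
server_ok = hmac.compare_digest(device_response, answer_challenge(challenge))

# ...and the device validates the server the same way in reverse.
print("mutual challenge-response verified:", server_ok)
```

Because the challenge is random each time, a captured response can't be replayed, and `hmac.compare_digest` avoids leaking timing information during the comparison—two of the small details that separate a functional IIoT device from a secure one.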
Boost to cloud = better analytics
One essential job that IIoT can assist with is serving information to cloud-computing services, which in turn enables more and better analysis by its users—all with fewer of the hurdles and headaches that impeded similar efforts in the past.
Notably, GE Digital has launched its Predix Manufacturing Data Cloud (MDC) to consolidate and transform manufacturing data across plants for enterprise cloud storage and analysis. Used in concert with a traditional MES, Predix MDC gives manufacturers operational analysis in the cloud and greater deployment flexibility, which shrinks on-premises systems, allowing them to run more efficiently.
"Most companies are only scratching the surface of realizing their data’s potential. In fact, today manufacturers are losing the value of 70% of collected manufacturing data,” says Matt Wells, vice president of product management for the digital plant portfolio at GE Digital. “Predix MDC delivers the power of cloud computing to manufacturers, taking the burden of heavy computing loads out of the plant, enabling users to aggregate data from across the business and run analytics that can uncover new insights and unlock greater efficiencies.”
Predix MDC also gives cloud functions to GE Digital’s Plant Applications manufacturing execution system (MES), which tracks and analyzes production execution and workflows, and manages production data and quality control. Predix MDC gives users a secure, reliable way to ingest and store data in the cloud, accelerating process implementation by as much as 50%, and giving users richer views of data captured at individual sites. It also runs analytics and comparisons across various locations and data types, including manufacturing, enterprise and asset data. In addition, MDC can reduce on-premises storage and maintenance costs, freeing up computing systems to focus on core capabilities.
To extend its own access and analytics efforts, Microsoft has launched several "intelligent industry innovations" including: Azure cloud-computing service adding end-to-end IoT security for devices, hubs and cloud resources; OPC Twin and OPC Vault joining Azure IoT Connected Factory service to give users a digital twin of their OPC-UA-enabled equipment to also enhance security and certification management; and Azure IP Advantage expanding to users with IoT devices connected to Azure and devices powered by Azure Sphere and Windows IoT.
"We're taking engineering work out, so users can achieve their data-driven culture, and have all their data available to them regardless of where they're sitting," says Çağlayan Arkan, global lead, manufacturing and resources industry, Microsoft. "These solutions will let any manufacturer do IoT with two-way communications, and start streaming devices worldwide, including controls. For example, we're providing an OPC twin that lets users browse, write instructions back to a digital feedback loop, manage digital devices easily with dashboards in an Azure IoT hub, and secure them end-to-end with Azure Security for IoT audits and potential threat detection."