Sometimes users already have just the right switches, cables and connectors they need for IIoT, but more often than not, they must upgrade or replace some sections to clear a path for the added signals, data and intelligence they want to acquire.
To better organize data and gain added insights from its 356 production sites worldwide, BASF’s Reliability Center wanted to adopt new manufacturing intelligence applications, but recently learned it had to install a new, integrated, industrial data platform to replace many existing point-to-point networks. This platform operates like the middleware that connects many business and operations management applications, but Dr. Michael Krauss, senior automation manager at BASF, reports there were some common barriers to getting it up and running. These include:
- Hardwired physical layers and intrinsically safe (IS) barriers at many chemical industry plants that make it difficult to obtain pure process values from the field to the control system.
- Lack of context for data due to, for example, coded tag names that are useful locally but not to remote users. Instead, descriptive tag names are needed that identify location, equipment and function, so remote users can identify assets and apply analytics. If naming differences persist, the overall data platform must be able to connect to multiple information sources, and map data to a uniform enterprise-wide naming convention, perhaps via an information broker.
- After data storage and tag names are standardized, information must be globally available via a broker that can identify and retrieve it from wherever it resides. Regional replication of data could speed up searches for geographically distributed assets.
- Overly varied local displays need the overall data platform to provide standardized dashboards to ease cognitive loads and learning curves.
- Deploying a global data platform requires users to address endpoint cybersecurity, user authentication and compliance with different national regulations. Scaling up worldwide also presents challenges due to application performance, real-time availability and data reliability requirements, high initial and maintenance costs, and the difficulty of designing and implementing global standards for business units with different needs.
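The tag-naming barrier above can be illustrated with a minimal sketch, in which an information broker keeps a per-site lookup table that translates local coded tags into descriptive, enterprise-wide names. All tag names, site names and the mapping scheme here are hypothetical examples, not BASF's actual convention:

```python
# Minimal sketch of mapping coded local tag names to a descriptive,
# enterprise-wide naming convention via a per-site lookup table.
# All names here are invented for illustration, not BASF's real tags.

SITE_TAG_MAPS = {
    "ludwigshafen": {
        "TI4711": "DE-LU/Unit12/Reactor3/Temperature",
        "PI0815": "DE-LU/Unit12/Reactor3/Pressure",
    },
    "geismar": {
        "T-104": "US-GE/Unit02/Column1/Temperature",
    },
}

def to_enterprise_tag(site: str, local_tag: str) -> str:
    """Translate a site-local coded tag into the global naming convention."""
    try:
        return SITE_TAG_MAPS[site][local_tag]
    except KeyError:
        # Unmapped tags are flagged rather than silently passed through,
        # so naming gaps surface during integration instead of in analytics.
        raise LookupError(f"no enterprise mapping for {site}/{local_tag}")

print(to_enterprise_tag("ludwigshafen", "TI4711"))
# The translated name carries location, equipment and function,
# so a remote user can identify the asset without local knowledge.
```

Because the mapping lives in the broker rather than in each plant, sites can keep their local conventions while remote users see one uniform namespace.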
Stand and deliver sources
Consequently, Krauss reports that BASF implemented a data platform from longtime partner Inmation Software (www.inmation.com), which specializes in industrial system integration middleware. Inmation was acquired last October by Aspen Technology Inc. and folded into its DataWorks business unit. This platform brings in and connects information from different sources, including legacy historians, spreadsheets and new sensors, and helps BASF analyze its data and give users improved intelligence.
More specifically, AspenTech Inmation provides real-time, bidirectional connections using single-port TCP/IP, which fits with BASF’s IT organization and system integration standards. The platform prioritizes compressed and encrypted data transport via BASF’s WAN VPN in close to real-time. Inmation provides a multi-layered information broker that decouples the company’s many data sources from its data-consuming applications. BASF’s infrastructure and enterprise-wide deployments, including data-driven dashboards, run on a NoSQL MongoDB database, which can handle many data types, such as time series, text, alarms and events, and can be set up and replicated quickly worldwide to accelerate access for local users. Beyond these core capabilities, the platform embeds technologies such as data-driven digital dashboards, HTML5 and streaming analytics.
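The advantage of a document store for mixed record types can be sketched roughly as follows: time-series points, alarms and events share one envelope instead of separate, schema-bound tables. The field names and envelope structure here are illustrative assumptions, not Inmation's actual schema:

```python
# Sketch of storing heterogeneous record types (time-series values,
# alarms, events) as uniform documents, as a NoSQL store like MongoDB
# permits. Field names are hypothetical, not Inmation's real schema.
from datetime import datetime, timezone

def make_doc(kind: str, tag: str, payload: dict) -> dict:
    """Wrap any record type in a common envelope so one collection
    can hold time-series values, alarms and events side by side."""
    return {
        "kind": kind,               # "timeseries" | "alarm" | "event"
        "tag": tag,                 # enterprise-wide tag name
        "ts": datetime.now(timezone.utc).isoformat(),
        **payload,                  # type-specific fields vary freely
    }

docs = [
    make_doc("timeseries", "DE-LU/Unit12/Reactor3/Temperature",
             {"value": 78.4, "unit": "degC"}),
    make_doc("alarm", "DE-LU/Unit12/Reactor3/Pressure",
             {"severity": "high", "text": "high-pressure limit exceeded"}),
]

# A consumer filters one collection by kind instead of joining
# per-type tables, which also simplifies worldwide replication.
alarms = [d for d in docs if d["kind"] == "alarm"]
```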
“The Inmation solution is distributed over 50 computers and running at BASF sites on four continents,” says Krauss. “It serves dozens of facilities worldwide, and connects with hundreds of different distributed BASF data sources.”
Coordinate on the data layer
Keith Flynn, senior product management director at AspenTech DataWorks, reports that so many data sources have emerged in recent years that many organizations don’t have the infrastructure to manage or gain value from the information they produce. “Most users could add a few sensors and get some vibration data, but what if you’ve got hundreds or thousands of assets?” asks Flynn. “Plus, many devices and their data aren’t standardized, so there are very few with common APIs, configurations and security. There isn’t one solution, so the IIoT market is flooded with technologies that can’t scale.”
To bring diverse data sources together, DataWorks used the preferred communications protocols of 50 sensor suppliers to develop a standard integration layer and operations data hub. This integration layer exposes the context of information from participating devices, so it can be compared and analyzed with data from other sources. “If a user has 1,000 source connections and 3 million data streams, DataWorks can concentrate and move that information to a cloud-based data lake, and contextualize and deliver it securely,” explains Dwaine Plauche, senior product manager at AspenTech DataWorks.
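The contextualization step Plauche describes can be sketched as enriching each raw sample with metadata from an asset registry before it is forwarded. The registry entries, sensor IDs and field names below are invented for illustration, not DataWorks' actual data model:

```python
# Rough sketch of contextualizing raw sensor readings before forwarding
# them to a data lake: each sample is enriched with asset metadata from
# a registry. Registry contents and field names are hypothetical.

ASSET_REGISTRY = {
    "sensor-0042": {
        "site": "US-GE",
        "equipment": "Pump P-201",
        "measure": "vibration",
        "unit": "mm/s",
    },
}

def contextualize(sample: dict) -> dict:
    """Merge a raw {sensor_id, value} sample with registry context so
    downstream analytics can tell which asset the number belongs to."""
    meta = ASSET_REGISTRY.get(sample["sensor_id"], {})
    return {**sample, **meta}

raw = {"sensor_id": "sensor-0042", "value": 3.7}
enriched = contextualize(raw)
# enriched now carries site, equipment, measure and unit alongside
# the value, so it can be compared with data from other sources.
```

Doing this enrichment once, in the integration layer, is what lets thousands of non-standardized devices feed one comparable data set.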
DataWorks starts each project by visiting the client’s development, qualification and production (DQP) sites, and using Inmation to digitally reproduce their assets and capabilities in duplicate or triplicate to create a redundant deployment architecture. Next, a binary version of Inmation is loaded onto the client’s central management console and central nodes, which allow access to the entire data layer.
“These nodes push software updates and patches, and also push soft connections that allow communication with the client’s DCS or other existing devices,” adds Flynn. “This method is an order of magnitude better. These configuration, integration, patching, maintenance and other tasks used to take dozens or hundreds of hours, especially when users had to go from machine to machine, but now they’re scalable and automated by software, and can be completed in minutes.”