
Modern data historians provide for optimization, savings

March 15, 2019
Data analytics programs optimize water injection completions and produce huge savings at two oil and gas production operations.
Shed a few pounds. Jettison some physical constraints. Both allow greater speed and freedom of movement, and this is just as true for process control components as it is for the people and processes that rely on them.

In the case of process historians and other data storage devices, as they've migrated from dedicated boxes onto faster microprocessors and fully software-based formats, they've also accelerated and diversified until it seems almost any component or chunk of code can be set to gather and archive data beyond the initial production cycle of its application. This has generated oceans of data, of course, so the new challenge is extracting and analyzing the right information to enable better decisions, and turning data and decisions around fast enough that they basically become part of the control loop.

"Ten years ago, single historians were often installed with one specific purpose in mind, and they'd be installed with a dedicated server for that one use case. Now, historians are more distributed and collect data from various shop-floor applications for aggregation to a facility and the entire enterprise," says Sam Russem, director of the Smart Manufacturing practice at Grantek Systems Integration, a CSIA-certified system integrator in Philadelphia. "It helps to think of the modern historian as a feature for long-term data storage and analysis, whether it's a small device on the plant floor or aggregating for larger systems on the enterprise level. The two basic types are: time-series historians that aren't relational, but are fast because they only do time stamps and values; and relational databases such as Microsoft SQL that store historian data in tables, rows and columns that follow a standard recording-and-retrieval interface.

"We recently worked with a power company that needed to monitor 30-50 solar sites across the U.S., and had a small, in-chassis, high-resolution historian at each site. These historians needed to support local data buffering for up to two weeks, and forward that information to an enterprise historian instance to accommodate unreliable Internet connections at the site. We designed a uniform system with a common network, and updated individual sites with Cisco switches, serial and redundant fiber rings onsite, and landlines or cellular as needed. This let them use predictive metering to plan how much power to generate compared to other sources, which made it easier to throttle their system, balance their output and satisfy regulatory reporting."

Surveillance for savings

Marcus Coleman, business relationship manager at Aera Energy LLC in Bakersfield, Calif., reports that its 22 x 2.5-mile Belridge complex in Kern County produces close to 80,000 barrels of oil equivalent per day from about 7,500 production wells and 5,000 injection completions. The complex uses light oil waterflooding to maintain well pressure, which requires operational surveillance of hundreds of completions per operator, who monitor flowmeter fouling, PID loop tuning analyses, failed surface control alerts, measurement analytics and pipe health dashboards.

Aera historically used manual monitoring and screw-in gauges on Belridge's water injection distribution lines, but these were limited mainly to reactive troubleshooting and responses of minimal value. More recently, it began adopting exception-based reporting using software developed in-house. This solution pulls data from multiple sources, creates function-based services and provides exception alerts, which allows more proactive responses. As part of this effort, it also adopted PI software from OSIsoft as its high-resolution historian in 2014, and Coleman states the adoption started out with some small, meaningful wins.
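At its core, exception-based reporting evaluates each incoming reading against an expected band and surfaces only what falls outside it. The sketch below illustrates that pattern in Python; the tags and limits are invented for the example and are not Aera's in-house software.

# Generic exception-based reporting sketch (illustrative tags and limits):
# only out-of-band readings become alerts.
EXPECTED_BANDS = {
    "WI-1042.RATE": (180.0, 220.0),    # injection rate, bbl/day
    "WI-1042.PRESS": (950.0, 1100.0),  # injection pressure, psi
}

def exceptions(readings):
    """Yield (tag, value, band) for every reading outside its expected band."""
    for tag, value in readings.items():
        band = EXPECTED_BANDS.get(tag)
        if band and not (band[0] <= value <= band[1]):
            yield tag, value, band

latest = {"WI-1042.RATE": 147.0, "WI-1042.PRESS": 1012.0}
for tag, value, (lo, hi) in exceptions(latest):
    print(f"ALERT: {tag} = {value} outside expected {lo}-{hi}")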

Saving with surveillance

Figure 1: Operational and engineering surveillance with OSIsoft's PI helps optimize 5,000 injection completions in the light oil waterflood program at Aera Energy's 22 x 2.5-mile Belridge complex in Kern County, Calif. The software uses pressure curve data to verify pipeline clearance and health, and provides wellbore integrity alerts, well interaction analysis, production allocation measurement dashboards and pressure transient analysis. Source: Aera Energy and OSIsoft

"We started out using PI for small pilot projects and production wells, and it let us collect higher-resolution data that enabled advanced analytics and exception-based signals," says Coleman, who presented at ARC Advisory Group's recent Industry Forum 2019 in Orlando. "For instance, we have hundreds of miles of pipe for injecting water to maintain pressure. However, when the pipes get dirty, the pressure drops, so we have to model what they should be delivering. As a result, we added OSIsoft's PI Vision software with CoreVision. This lets us see all our distribution pipelines and their health at a glance onscreen.

"In the past, our manual gauges and their data would change before our personnel could even get back to the office. Now, PI shows us what the pressure design curve should be and what it is, which tells us if a pipe is clear within 10 minutes at the start of a day, so we know what needs to be cleaned that day. This creates huge efficiencies, and keeps the water injectors and oil-producing wells online." The engineering surveillance program using PI at the Belridge complex also includes wellbore integrity alerts, well interaction analysis, production allocation measurement dashboard, and pressure transient analysis (Figure 1). Even more gains were possible, though they'd require some added collaboration and a cultural shift.

Breaking out of storage

Of course, once historians grew beyond their traditional data archiving role, they also began acquiring some new nicknames, such as "data analytics tool" or "data infrastructure."

"The big thing we're seeing is how clients are classifying what a historian is instead of just software," says Will Aja, customer operations VP at Panacea Technologies Inc., a CSIA-member system integrator in Montgomeryville, Pa. "Where they previous threw data in and ran reports later, many are classifying historian data akin to a utility like electricity or chilled water, which means they're thinking of them and treating them differently. Also, where historians were formerly only used for critical data collection, users can now afford to collect all the data they want, which raises the question of do they need to collect everything? This is also changing the way process applications are designed and physically coded because reports can be pulled and cleaned more easily, while real-time data aids modeling and analysis by high-level calculation sets. All of this means today's historians are more about using data rather than storing it."

Innovative historian applications

Chris Nelson, software development VP at OSIsoft, reports that, by evolving from a storage system to a system that can synthesize data into information, historians become a key part of operations.

"It doesn’t control devices; it lets you better understand operations, so you can make the right control decisions," says Nelson. "One example is Air Liquide. They use the PI System for some classic applications like predictive maintenance, reducing downtime and process optimization. Some of these tasks are performed directly on the PI System. In other situations, PI is used to cleanse and organize data, so that it can be used by a third-party algorithm.

"More recently, through a wholly owned subsidiary called Alizent, Air Liquide has begun to market plant performance services to its customers. In other words, Air Liquide is leveraging its own internal capabilities and turning it into a revenue stream."

Nelson adds that Uniper, a large German utility, is doing something similar. Based on the PI System, the utility developed a predictive maintenance application called Tiresias, and is seeking ways to commercialize it.

"Again, this is all possible because the historian is not functioning like a vast deep freeze for machine data," says Nelson. "It’s a live feed into operations served up in ways that people can understand. They can also use it to track performance over time to make sure the results match the intended outcomes.

"Another example comes from Australia Gas & Light, which grew from a capacity of 300 MW to 10 GW in nine years, but didn’t have much data visibility. It started encouraging engineers and even non-technical employees to build consoles, and created a central diagnostic center. They invested $1.2 million in software, and were able to recover $21 million in three years through reduced downtime, reduced curtailment penalties, reduced repair costs and improved power output. The PI System also detected a generator flaw that could have cost $50 million. David Bartolo, who runs innovation at AGL, adds they're in a situation where 'anyone can become the next great data scientist.' ”

Wayne Matthews, divisional director of Yokogawa Marex, a division of Yokogawa U.K. Ltd., adds: "The value of historians lies in their ability to make process information accessible from any location and level of a large organization by combining data of a heterogeneous nature, such as process, events and recipes, based on context information. Historians offer additional flexibility to build complex data and application architectures with one software platform, where components can be deployed in the cloud or on-premises in the most effective way. Process historians are an integral part of digital transformation, helping to automate and streamline business processes and bring new life to digital data.

"Modern process historians such as Yokogawa’s Exaquantum also provide a platform for delivering operational value from the wealth of process automation systems using standard SQL structures and open interfaces. They provide the foundation to easily create and develop an impressive suite of modular applications tailored to meet specific customer needs through deployment in physical, virtualized or hosted environments on premise or in the cloud."

Chris Nelson, software development VP at OSIsoft, adds: "The historian has gone from a place of storing data to becoming a data infrastructure, which is a distinct layer in computing architecture used for data management and data-driven decision making. The key moment in this evolution occurred when data storage and management companies such as OSIsoft began to shape and organize time-series data (TSD), so users could see and understand it in context, and people could quickly act on it. With a historian, a pump might be 27 different, distinct data streams. In a data infrastructure, they're organized so users can see the interactions and quickly understand how subtle changes in its performance can have big impacts downstream."
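One way to picture the organizing step Nelson describes is a simple asset model that maps meaningful attribute names onto the raw tag names underneath, so derived values can be calculated without anyone remembering all 27 streams. The snippet below is a conceptual illustration in plain Python, not OSIsoft's Asset Framework API, and every name in it is hypothetical.

# Conceptual sketch of organizing raw historian tags into an asset model.
pump_p101 = {
    "suction_pressure":   "PLANT1.P101.PT-001.PV",
    "discharge_pressure": "PLANT1.P101.PT-002.PV",
    "motor_current":      "PLANT1.P101.IT-003.PV",
    "bearing_temp":       "PLANT1.P101.TT-004.PV",
    # ...a real pump might carry dozens of such streams
}

def differential_pressure(read_tag):
    """Derived attribute: discharge minus suction pressure for pump P-101.

    `read_tag` is whatever function fetches the latest value for a tag name.
    """
    return (read_tag(pump_p101["discharge_pressure"])
            - read_tag(pump_p101["suction_pressure"]))

# With a stand-in reader, the same question can be asked of any pump asset
# without touching the underlying tag names.
fake_values = {"PLANT1.P101.PT-002.PV": 188.0, "PLANT1.P101.PT-001.PV": 31.5}
print(differential_pressure(lambda tag: fake_values[tag]))   # -> 156.5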

Nelson reports OSIsoft's first step in this area was a technology called Asset Framework to which it added visualization. "Today, this technology is better known as a digital twin, but it’s been percolating for years," explains Nelson. "What makes this moment pivotal is that it changes the relationship between users and the data coming from the system, and turns it into actionable data. Now, people can conduct analytics, perform plant optimization, develop energy efficiency strategies and other tasks on what used to be known as historians.

"The challenge is we really don’t have a new, pervasive name for it yet. We call it a 'data infrastructure.' You could say historians are becoming IoT platforms or at least a large part of the layers that constitute an IoT platform. Or you could call them data management, visualization and analytics platforms."

Crossing the streams

Thanks to the flexibility historians gain from integration with other manufacturing applications, the data they handle can also be combined, analyzed and molded in more helpful ways.

For instance, Devon Energy in Oklahoma City is an independent oil and natural gas exploration firm that produces about 250,000 barrels of oil, 1.2 billion cubic feet of natural gas, and 1,000 barrels of natural gas liquids per day. These operations and its multi-billion-dollar capital drilling program generate huge amounts of data, but until recently, the company had a difficult time finding an analytics program that could coordinate all of its data and provide useful intelligence, according to Devon Koetter-Manson, completion engineer at Devon, who also presented at the ARC Industry Forum.

"We've seen a data infusion over the past few years from people, hardware and software," says Koetter-Manson. "However, our advanced analytics efforts focused largely on sexy, subsurface topics like seismic, reservoir and other geologic issues.  We often overlooked using data analytics for surface operations, even though they had the potential for greater impact."

Because its operations produce huge amounts of wastewater along with oil and gas, Koetter-Manson reports that Devon must reuse or dispose of it using storage and settling tanks, filters, injection pumps and disposal wells into old, depleted oil zones. To help manage these applications and equipment, the company has been using PI historian software, but recently added analytics software from Seeq Corp. to show its data streams in conjunction with each other for more detailed performance indicators and even better decisions.

"This wastewater has a lot of small, hard oil particles in it that we filter out. This quickly clogs the filters, but so we can keep on top of it with the historian's combined flow rate and differential pressure information," says Koetter-Manson. "Seeq also pulls hidden amperage trends, finds comparables in the raw data, and give them to us 2D plots that are understandable."

In addition, Seeq provides storage/settling, filter, injection pump and disposal well data in combined and side-by-side charts that show the health of Devon's entire system (Figure 2). These multi-platform results can indicate issues that need to be addressed more effectively than each individual platform could on its own. "We're also using Seeq to spot deviations from normal by identifying the small data points that signal them," adds Koetter-Manson.

Data streams add up

Figure 2: The storage/settling, filter, injection pump and disposal well system that Devon Energy uses to process wastewater from its oil and gas wells relies on Seeq to deliver data in combined and side-by-side charts that show the health of the entire system, and to present multi-platform results that can indicate issues that need to be addressed more effectively than each individual platform could on its own. Source: Devon Energy and Seeq

Michael Risse, VP and CMO at Seeq, adds: "There are far more systems out there collecting data than we think. For example, every SCADA system has some kind of historian, and there are an increasing number of ways to store TSD in historians, cloud services, data lakes and open-source offerings. Plus, there are more options now where TSD can reside, such as on premise, on the edge, or in the cloud with a software as a service (SaaS) application. All of these options mean that TSD storage will become more affordable than in the past.

"The challenge is maintaining connections and access to data sources and stored data, and finding the right tags that you need. Seeq's average customer has 30,000 to 70,000 tags or sensors, so it can be a problem for users to find the right ones, and link them to their applicable assets and context to make sense of them. This is why Seeq isn't a store for TSD, and instead focuses on creating connections and enabling advanced analytics on TSD."

Teamwork aids analytics

Back at Aera's Belridge oil field in California, Coleman reports this pipeline health program was one of two analytics initiatives Aera implemented using PI. The other involved cycling high-pressure wells on and off as part of their injection strategy, which prevents overpressure and the potential for costly redrilling. "PI lets us determine downtime pressure in the well by using a pressure transmitter at the surface," explains Coleman. "Our effectiveness in managing high-pressure wells improved with PI, which is useful because we get a lot of exception alerts. 

"We also connected PI to our in-house, exception-based system and source data, and this is how we really began to communicate between our work groups and automate them. For example, PI might begin by reporting a metering issue at a well, but now it could also let the engineers know they had an issue in the field that could reduce injection. PI was getting people to focus on the right tasks by enabling direct, precise communication between engineering and operations. Our success really began to take off when we connected people, expanded their capabilities, and gave them real-time views for their decisions. The secret sauce is connecting the right people with the right tool."

About the author

Jim Montague | Executive Editor

Jim Montague is executive editor of Control.