
Lather, rinse, repeat data conditioning

May 18, 2021
There's more than one way to clean, organize and contextualize information so it can be analyzed quickly and thoroughly, leading to better insights and decisions.

Gathering, storing and preparing data for transmission, access and analysis by other users and applications can involve many different steps. These steps also vary with the performance profile of the initial process and with the needs of the analyses users want to conduct. However, there are some common threads and requirements. Here are some of the most frequent tasks:

  • Define one or more production problems that could benefit from improved analysis to help define and direct the search for the most appropriate data analytics solution.

  • Enlist an internal team and external system integrator, expert supplier or other partners to develop data analytics policy, procedures, requirements, specifications and a schedule for implementing them.     

  • Identify existing and expected information sources and files that need to be analyzed, including their locations, data formats and specifications.

  • Assess how data is gathered from sources, entered, handled, communicated and stored locally, in historians, on servers, in the cloud or elsewhere.

  • If signals, parameters and scheduling data are coming from multiple sources, check if they need to be pre-analyzed or coordinated, or if gaps need to be filled in before they're sent for further analysis (see the time-alignment sketch after this list).

  • If a process application has an existing historian, determine if planned analytics software can automatically interact with it and collect its data, or if some added capability is needed (a hedged historian-query sketch follows this list).

  • Decide where to perform analytics: locally, in an on-premises server, or in a cloud-computing service. Balance the benefits and costs of processing data-intensive applications locally and reporting by exception against the risk of not sending enough data to the cloud and missing crucial trends (a minimal report-by-exception filter is sketched after this list).

  • Evaluate if any data conversion is needed, and if so, determine if a software driver or other middleware can be applied and run automatically, or if an external or manual function needs to be installed (see the conversion-layer sketch after this list).

  • Determine which networking protocols are used to move information from where it's generated to where it will be analyzed, remove any communication snags, and test that the applied fixes have actually resolved them (a basic reachability check is sketched after this list).

  • If legacy sensors, instruments, I/O, PLCs and other components aren't plugged into any historian or network, plan to get them connected, or devise another way to rescue their stranded data with minimal expenditure of time and labor (see the serial-polling sketch after this list).

  • Design, pilot, test and periodically reexamine data analytics programs, software, components and networks to determine if existing needs are being met or if new capabilities need to be added.
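
To make several of these steps more concrete, the sketches below use Python; every tag name, endpoint and parameter in them is an illustrative assumption, not a reference to a specific product. First, a minimal sketch of the time-alignment and gap-filling step, using the pandas library to put two signals with different sample rates onto one common grid:

```python
# Minimal sketch of aligning two signals sampled at different rates and
# filling short gaps before analysis. Tag and column names are hypothetical.
import pandas as pd

# Two hypothetical source feeds: a fast pressure signal and a slower lab assay.
pressure = pd.DataFrame(
    {"pressure_psi": [101.2, 101.5, None, 102.0]},
    index=pd.date_range("2021-05-18 08:00", periods=4, freq="15s"),
)
assay = pd.DataFrame(
    {"purity_pct": [99.1, 99.3]},
    index=pd.date_range("2021-05-18 08:00", periods=2, freq="30s"),
)

# Resample both onto a common 15-second grid, interpolate short gaps in the
# fast signal, and forward-fill the slow one between lab samples.
grid = pressure.resample("15s").mean().interpolate(limit=2)
grid["purity_pct"] = assay.resample("15s").ffill()["purity_pct"]

print(grid)
```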
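
For the historian step, this sketch assumes a historian that happens to expose a simple REST interface; the URL, query parameters and JSON layout are invented for illustration, since real historians each have their own APIs or OPC interfaces:

```python
# Hypothetical sketch of pulling one tag's history from a historian that
# exposes a REST interface. The endpoint, parameters and response layout
# are all assumptions, not any vendor's actual API.
import requests

BASE_URL = "https://historian.example.com/api/history"  # hypothetical endpoint

def fetch_tag(tag, start, end):
    """Request raw samples for one tag over a time window."""
    resp = requests.get(
        BASE_URL,
        params={"tag": tag, "start": start, "end": end},
        timeout=10,
    )
    resp.raise_for_status()  # surface HTTP errors instead of silently bad data
    return resp.json()       # assumed: a list of {"t": ..., "v": ...} samples

samples = fetch_tag("FIC-101.PV", "2021-05-18T00:00:00Z", "2021-05-18T01:00:00Z")
```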
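
For the local-versus-cloud decision, this is a minimal report-by-exception filter: it forwards a reading only when it moves more than a deadband from the last reported value. The deadband is also where the trade-off lives, since too wide a band can hide crucial trends:

```python
# Minimal report-by-exception filter: only forward a reading when it moves
# more than a deadband from the last reported value. The deadband here is
# an illustrative figure, not a recommended setting.
def filter_by_exception(readings, deadband=0.5):
    last_sent = None
    for value in readings:
        if last_sent is None or abs(value - last_sent) > deadband:
            last_sent = value
            yield value  # this is what would be transmitted to the cloud

readings = [100.0, 100.1, 100.2, 101.0, 101.1, 99.8]
print(list(filter_by_exception(readings)))  # [100.0, 101.0, 99.8]
```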
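
For the data-conversion step, a small conversion layer might normalize mixed-source records into common engineering units before analysis; the field names and scaling below are hypothetical examples of what a driver or middleware would do automatically:

```python
# Sketch of a small conversion layer that normalizes mixed-source records.
# Field names, the ADC span and the engineering range are hypothetical.
def c_from_f(deg_f):
    return (deg_f - 32.0) * 5.0 / 9.0

def counts_to_engineering(counts, lo=0.0, hi=150.0, span=4095):
    """Map a raw 12-bit ADC count onto an engineering-unit range."""
    return lo + (counts / span) * (hi - lo)

record = {"temp_F": 212.0, "flow_counts": 2048}
normalized = {
    "temp_C": round(c_from_f(record["temp_F"]), 2),
    "flow_gpm": round(counts_to_engineering(record["flow_counts"]), 2),
}
print(normalized)  # {'temp_C': 100.0, 'flow_gpm': 75.02}
```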
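
For checking networking paths, a basic reachability test simply tries to open a TCP connection on the protocol's well-known port (502 for Modbus/TCP, 4840 for OPC UA); the host names here are placeholders:

```python
# Basic reachability test for the path between a data source and the
# analytics host: try to open a TCP connection on the protocol's port.
import socket

def port_open(host, port, timeout=3.0):
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical hosts; 502 is Modbus/TCP, 4840 is OPC UA.
for host, port in [("plc-01.example.com", 502), ("opcua-gw.example.com", 4840)]:
    status = "reachable" if port_open(host, port) else "BLOCKED"
    print(f"{host}:{port} {status}")
```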
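
Finally, for stranded legacy data, one low-cost rescue is polling the device over its serial port with the third-party pyserial package; the port name, baud rate and ASCII poll command are assumptions about the particular device:

```python
# Hedged sketch of rescuing stranded data from a legacy serial instrument.
# The port, baud rate and poll/response framing depend entirely on the device.
import serial  # third-party: pip install pyserial

with serial.Serial("/dev/ttyUSB0", baudrate=9600, timeout=2) as port:
    port.write(b"READ?\r\n")  # hypothetical poll command
    reply = port.readline().decode("ascii", errors="replace").strip()
    print("raw reading:", reply)  # forward this to the historian or network
```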

About the author

Jim Montague is executive editor of Control.
