
Albemarle finds data insights need engineers

May 14, 2020
Part 2: Chemicals manufacturer reports people are essential for giving analytics software crucial context

To cash in on its increasing speed and expanding capabilities, data analytics requires participants to understand networking, the Internet, cloud computing and other digitalization technologies. However, it's equally crucial for them to incorporate the know-how of process engineers and other plant-floor professionals.

"In the past 10-11 years, we've seen industrial users transition from looking backward at past problems to learn why a product didn't meet specs to now using analytics to look at what their processes are doing now and provide early warning signs about what they'll do in the near future," says Peter Guilfoyle, CEO at Northwest Analytics, which has been offering manufacturing analytics software and services for more than 30 years. "The smartest companies have figured out if they can run their processes smoother and without stressing their internal systems, then they'll have less wear-and-tear and run longer. Adjusting processes based on analytics, not on hunches, means more consistent production and greater profits."

Jonathan Alexander, manufacturing data analytics leader at specialty chemicals company Albemarle Corp. in Charlotte, N.C., reported during a presentation at Northwest Analytics' Manufacturing Leadership Forum 2018 that his company has been on a five-year journey to break free of its traditional reactive mindset and repetitive problem-solving, and apply analytics to its operations in real time to maximize value and impact. The Manufacturing Leadership Forum delivers real-world insight from those who have successfully implemented digital transformation programs using manufacturing analytics. (For more information or an invitation to the next event in Houston, email [email protected].)

"We had the basic Six Sigma goals of reducing variability, being more consistent and reducing defects—even if we're still in spec. This gave us some added wiggle room and flexibility to better handle future process shifts and rate changes. Operators don't need to be data scientists or statisticians. They just need to know if a decision is right or wrong. As a part of our big data overhaul, we utilized the laboratory information management system's (LIMS) quality data for one of our products, and leveraged statistical process control techniques to simplify them for making right-or-wrong decisions. Just doing this had a dramatic change on our process and saved time."

Alexander reported that Albemarle then began using real-time, auto-updating statistical process control (SPC) charts to push for tighter control limits, allowing operators to respond more quickly and fix potential problems before violating specs. While examining its LIMS data on viscosity and the influence of pressure, temperature and flow, the team began running data points through contextual filters to make sure they were meaningful and useful. In addition to LIMS data, Albemarle incorporated data from its OSIsoft PI data historian to identify variable shifts, trends and shift-to-shift issues that might need to be addressed. Previously, these were displayed on dozens of complex graphs; what was needed was an approach that simplified the graphics and flowcharts so operators could respond faster.
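The article doesn't spell out which contextual filters Albemarle applies, but the general pattern of merging LIMS lab results with historian tags and discarding samples taken outside representative operating conditions can be sketched as follows. The tag names, tolerances and thresholds are hypothetical, and the data is assumed to have already been exported to DataFrames.

```python
# Sketch of contextual filtering: attach process conditions to each lab
# sample, then keep only the samples that are meaningful for SPC.
import pandas as pd

lims = pd.DataFrame({
    "time": pd.to_datetime(["2020-01-01 08:05", "2020-01-01 12:02"]),
    "viscosity_cp": [42.1, 44.0],
}).sort_values("time")

historian = pd.DataFrame({
    "time": pd.to_datetime(["2020-01-01 08:00", "2020-01-01 12:00"]),
    "reactor_temp_c": [182.0, 176.5],
    "feed_flow_kgph": [950.0, 310.0],
    "unit_running": [True, True],
}).sort_values("time")

# Attach the nearest historian readings to each lab sample
merged = pd.merge_asof(lims, historian, on="time",
                       direction="nearest",
                       tolerance=pd.Timedelta("15min"))

# Contextual filter: keep samples taken while the unit was running at a
# representative feed rate, so startups and turndown operation don't
# skew the SPC chart (illustrative thresholds).
meaningful = merged[(merged["unit_running"]) & (merged["feed_flow_kgph"] > 500)]
print(meaningful[["time", "viscosity_cp", "reactor_temp_c"]])
```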

"By applying advanced analytics, we learned we didn't really know everything we thought we knew about alarm causes. It was a bit of culture shock, but we kept at it, rebuilding our fundamental process knowledge, and made our way back to greater process knowledge over time," explains Alexander. "Our analytics-based alarm dashboards plotted 1,325 real-time control charts in one dashboard at one site. However, we only had about 10 alarms every day. By minimizing the alarms, we could then spend more time identifying the root cause, without being overloaded with data. The results of this analytics-based approach are increased productivity, quality, process knowledge, employee development, equipment reliability and cash.

"When we first implemented manufacturing analytics using Northwest Analytics' Focus EMI software, we slowly but surely began reducing variations in product quality at multiple sites across Albemarle due to day-to-day problem solving with real-time data. After six months with the pilot project, one site went 18 straight months with zero out-of-spec materials. After an equipment failure briefly took us out of spec, that site operated an additional 18 months with zero out-of-spec product. For Albemarle, that was the first time in 40 years we were able to do that, and that one example saved $650,000 on blocked stock."

Alexander adds that data analytics dashboards can be used in a variety of ways, including: by maintenance for checking pump vibrations, motor amps and scrubbers; at the executive level for evaluating overall equipment effectiveness (OEE), yield and availability; and by safety teams for tracking near misses, audit findings and injury predictions. The typical approach at Albemarle to applying analytics includes installing software, getting IT involved to connect to different databases, organizing and analyzing data, collaborating closely with plant-floor coworkers to secure buy-in, designing dashboards, and implementing an agile process-improvement approach that works to create sustainable habits.

"We had a customer that needed a product with a higher purity level. Using our real-time analytics platform, we were able to fulfill their request and secure the business," adds Alexander. "We're also using analytics to discover new operating ranges in areas like reducing raw material utilization, which has saved about $500,000 per year. While our initial goal was to reduce variation, we found there are so many additional opportunities for improvement. By providing the right type of visibility, we were able to empower people to make the right changes at the right times."   

Guilfoyle adds that checking for what is not normal, reporting and managing by exception, and basic univariate analytics have been available for 80-90 years. "Even most machine learning algorithms aren't new. What's new is the technology for accessing and packaging analytics, as well as the renewed interest in analytics, and users waking up to the potential in their data to drive operational and competitive excellence," says Guilfoyle. "But there's no 'easy button' when it comes to analytics. There are better connections and networking technologies, but they don't deliver value by themselves. They only succeed when data analytics is combined with subject matter experts (SMEs). Some users want to plug in software and be done, but questions arise, and multivariate models must be updated by someone due to changing conditions. This is why people are essential, and why most manufacturers are very skeptical of any supplier who says different. The magic happens when SMEs, analytics tools and data get together."

About the Author

Jim Montague | Executive Editor

Jim Montague is executive editor of Control. 
