Leveraging big data to streamline plant operations

“Actionable intelligence—the goal of insights and analytics—is still a goal after 20 years,” says Michael Risse, vice president, emerging markets, Seeq Corp., at this year’s HUG conference.

By Paul Studebaker

Back when I was a process engineer in a magnet factory, we almost always had at least one of our many product lines that wasn’t meeting specifications. When the magnets got to final inspection, we would magnetize and test a sample group, and too often they just weren’t good enough. So we were constantly engaged in witch hunts, running around analyzing compositions, inspecting equipment, talking with operators, etc., to see what wasn’t as it should be.

The heat treating department would point fingers at the foundry, the foundry suggested the heat treating furnaces needed calibration, and everybody would say it all started when purchasing changed the source of aluminum, nickel, mold sand or pixie dust. We’d try a lot of things, and sooner or later, the magnets would get better, though half the time we couldn’t be sure it was because of something we fixed. We just didn’t have enough factual information.

I’m reminded of those interesting old days by our response to U.S. mass shootings, where every new incident brings out the same calls for gun control, arming potential victims, mental health screenings, bombing terrorists at the source, stopping immigration, monitoring by religion, etc. It seems everybody has a silver bullet that, coincidentally, is pointed at somebody else who they didn’t much like in the first place.

While the public reacts and politicians pontificate, investigators comb the perpetrators’ activities and background, including phones, computers, communications, social media—anything that might help them understand their motivation, as well as find and apprehend any co-conspirators. We can assume every bit of information becomes part of a profile used to build a knowledge base and refine a set of algorithms designed to find and bring attention to potential terrorists before they act.

The utopian vision for Big Data, the cloud, and the Industrial Internet of Things (IIoT) would have us imagine a similar approach for your plant. Powerful analysis packages will detect bad actors among equipment, materials and operators; notify decision-makers; and perhaps even recommend or implement a corrective action before quality is compromised or something blows up.


And we may get there. We may even learn how to defuse the radicalization of potential terrorists before they act, maybe with cheerful Facebook news feeds or links to cat videos.

Meanwhile, back at the plant, we could make a lot of progress with the process data we’ve been collecting for decades, if we could use it more effectively. The recent Honeywell Users Group (HUG) conference in San Antonio took a clear-eyed look at IIoT applications, including that much-romanticized data analysis layer.

“Actionable intelligence—the goal of insights and analytics—is still a goal after 20 years,” says Michael Risse, vice president, emerging markets, Seeq Corp. “It still hasn’t been delivered.”

Risse says that while we’re waiting for all-powerful algorithms, we can use human expertise to prevent and solve problems. “Someone has to be the expert. Why not the engineer with the expertise?” he asks. Honeywell has partnered with Seeq to empower its Uniformance Insight analysis software with Seeq Workbench, which brings data from real-time and transactional sources together in a single view.

It replaces Excel spreadsheets and augments historians with data from virtually any source, easily configured for the most meaningful views and sharing with others via the cloud.

“The engineer doesn’t need to know or care where the data is,” says Mike Brown, Honeywell global director, advanced solutions. “You can combine data from three sources in real time to identify problems and opportunities.”

It would have been nice to have that at the magnet factory.


Comments

  • There are software products out there making inroads into this process analysis, using some very smart methods. The team at PPCL, led by Robin Brooks, uses GPC to track operating envelopes for up to about 1,000 variables with their CVE and CPM packages. The team at TrendMiner from D-Square has a PI historian add-on that compares process patterns against previous best operations and golden-batch operations, acting as a supervisor for multiple variables and alerting operations to deviations that could lead to out-of-spec products. Both are worthy candidates as usable tools, right now, for so-called Big Data analytics.

