My first years in the magnet factory were in the R&D department, where three of us worked on new products and processes. We had miniature versions of production melting furnaces, powder mills, presses, sintering and heat-treating furnaces, etc., all in a space of about 3,000 square feet that we shared with quality control. We were trying to leverage the findings of university papers and industrial patents, repeating and building on their work with an eye to making improvements and getting patents of our own.
At the time, the physics behind the magnetic properties of elements and alloys was strong in theory and sketchy in details, so new and better magnets were made by trying stuff: different compositions, powder and grain sizes, heat treatments, etc., using Design of Experiments to find, and then optimize, families of magnets such as iron-chrome-cobalt and neodymium-iron-boron (NdFeB) to add to our catalog.
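A two-level full-factorial design is the simplest Design of Experiments layout: every combination of each factor's low and high setting gets a run. The sketch below illustrates the idea with hypothetical magnet-process factors (the factor names and values are mine, not from the original work):

```python
import itertools

# Hypothetical process factors, each with a (low, high) level.
# These names and numbers are illustrative only.
factors = {
    "cobalt_pct": (10.0, 14.0),         # alloy composition
    "grain_size_um": (3.0, 8.0),        # milled powder grain size
    "sinter_temp_c": (1050.0, 1120.0),  # sintering temperature
}

def full_factorial(factors):
    """Return every low/high combination as a list of run dicts."""
    names = list(factors)
    return [
        dict(zip(names, levels))
        for levels in itertools.product(*(factors[n] for n in names))
    ]

runs = full_factorial(factors)
print(len(runs))  # 2 levels ^ 3 factors = 8 experimental runs
```

With k factors at two levels, the design has 2^k runs; fractional designs cut that count when k grows, at the cost of confounding some interactions.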
In the 1980s, the science got better and computers allowed physicists to home in on likely new magnets using first principles, but even then, the real world would not cooperate, if for no other reason than that the purities of research-grade materials just aren’t available in industrial quantities. And if they were, we’d contaminate them anyway.
So, the optimization we did in the lab became just a starting point for production, where different equipment, batch sizes, furnace temperature profiles, etc., meant we had to find the most robust process conditions all over again.
Maybe if we’d had today’s comprehensive data collection, advanced analytics, some clever apps and a few YouTube videos, we’d have been able to shorten the R&D process. But those tools are mostly useful only if someone already knows how to do it. They’re good for promulgating and institutionalizing knowledge, but not the way it’s usually discovered.
With the focus we have today on spreading information around (and that’s extremely valuable), we tend to overlook the importance of developing it in the first place. So, I was pleased to attend a session at the recent Emerson Global Users Exchange on how Fisher Advanced Research is working to understand vibration failures of valves, piping and equipment.
Fisher researchers have instrumented a reconfigurable lab test rig with numerous vibration sensors and decibel meters to record frequencies, intensities and sound levels, and to correlate them with damage to valve components, piping and fittings using analytical software. They’ve learned some things about how operating conditions cause valve element damage, tubing failures and cracks in pipes.
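The core of that kind of analysis is correlating a measured vibration metric with observed damage. As a toy illustration only (this is not Fisher's actual method or data), a Pearson correlation over hypothetical rig readings might look like:

```python
import statistics

# Hypothetical data: an RMS vibration reading per test run,
# and the crack count observed afterward. Purely illustrative.
vibration_g = [0.5, 1.1, 1.8, 2.4, 3.0, 3.7]
crack_count = [0,   0,   1,   2,   4,   6]

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson_r(vibration_g, crack_count)
print(round(r, 2))  # close to 1.0: damage tracks vibration intensity
```

A strong correlation in data like this is what justifies using the lab metric as a proxy for field damage, which is exactly why validating against running plants matters.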
But they acknowledge that the lab research is not being done on real-world systems, and they need to correlate it with results from running plants. To that end, they’re working with Absolute Energy on a problem with repeated fatigue crack failures of a diffuser in a steam let-down system for an evaporator feed.
Absolute tried changing the diffuser, using different materials and relocating components, but the cracks recurred. A vibration consultant did an analysis and recommended the usual fixes—to add mass for damping, and to stiffen the structure—but those remedies are not practical on the elevated structure.
Fisher provided instrumentation, did an analysis based on its lab findings and is close to recommending a solution. Meanwhile, using the Fisher data, Absolute has correlated the cracking with operating the plant closer to its highest energy efficiency, so it’s preventing the failure by allowing the facility to run less efficiently.
As plants push processes to maximize productivity, quality and efficiency, more may run into valve and equipment failures due to higher stresses. If they do, Fisher invites them to become test cases for its vibration algorithms.
Real knowledge is built bit by bit. It’s great to see it happen, and even better to be part of it.