Making Accurate Batches

Weighing Out Material Isn't as Easy as It Looks


By Walt Boyes, Editor in Chief

When you're dealing with the bulk transfer of powders and solids, it can be pretty easy to weigh out the material. When you're feeding ingredients into a batch recipe, it isn't all that easy. What you are trying to do in a case like that is feed the exact amount of material in the shortest acceptable time. Speed and accuracy are hard to maintain together, because accuracy generally reduces speed, and speed normally reduces the accuracy you can maintain.

If you're transferring large amounts of material, the feed rate can be very high, with very little variation. It's when you start trying to measure and batch small quantities of ingredients that things get difficult and interesting very quickly.

It's the Spill, See?

The problem is that weighfeeders, belt-weighers and screwfeeders don't start with a complete load, and they don't stop having left exactly the correct amount for the recipe. They produce what is politely called "spill." It is sometimes called many other things, but it is simply the amount of material you have over- or under-fed, based on when your stop command is issued. If your stop command comes too soon, you have under-fed material. If it comes too late, you've over-dosed the batch. Either can be a recipe for disaster in high-precision batching applications, like pharmaceuticals. Even food batching can be problematic. Too much flour or not enough shortening can make really bad cookies. Too much yeast, or not enough, can wreck a batch of bread dough.

There are a number of reasons why "spill" occurs. They include command lag, scale or filtering lag, and physical situations like a shut-off valve that doesn't close quickly enough, or material still in suspension when the stop command is issued. There is also the dynamic force added to the scale reading until the material flow has stopped. The magnitude of this deceleration force is related to the speed of the feed.

Command lag is the delay between the time a cut-off command is issued and the time the feeding device actually shuts off. Command lags can have many components, including the architecture of the controller, program logic and scan and update rates of all of the parts of the control loop.

Scale lag is the delay in the reading, and the output, of the scale from the actual instantaneous value it should be reading. This can be caused by damping or filtering algorithms designed to minimize signal jitter coming from vibration, agitation, or other process components.
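To see how filtering itself creates lag, consider a minimal sketch of an exponential moving-average filter, the kind of damping a scale indicator might apply to suppress vibration jitter. This is an illustration, not any particular scale's algorithm; the function name and parameters are assumptions. The stronger the damping (smaller alpha), the further the filtered output trails the true weight.

```python
def ema_filter(readings, alpha=0.2):
    """Exponential moving average; alpha in (0, 1], smaller = more damping."""
    filtered = []
    value = readings[0]
    for r in readings:
        value = alpha * r + (1 - alpha) * value  # blend new reading into estimate
        filtered.append(value)
    return filtered

# A weight ramping up at a steady feed rate: the smoothed trace runs behind
# the true value, so a cutoff based on it fires late.
ramp = [i * 1.0 for i in range(10)]   # true weight, kg
smoothed = ema_filter(ramp, alpha=0.2)
```

The cutoff decision is made on the filtered value, so the lag translates directly into extra material fed after the true weight reached target.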

Then there's material in suspension. This is the portion of the material that has passed through the discharge port or valve, but has not yet reached the surface of the mixture. This amount of spill depends on the distance between the discharge valve and the material surface, and it cannot easily be "zeroed" out because that distance varies batch-to-batch with the surface height of the mix.
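A back-of-the-envelope estimate (my illustration, not from the article) shows why this term matters: material already in free fall when the stop command fires keeps landing on the scale, and its mass is roughly the feed rate times the fall time over the drop height.

```python
import math

def suspended_material(feed_rate_kg_s, drop_height_m, g=9.81):
    """Approximate mass in the air at cutoff: feed rate x free-fall time."""
    fall_time = math.sqrt(2 * drop_height_m / g)  # seconds to reach the surface
    return feed_rate_kg_s * fall_time

# A 5 kg/s feed over a 2 m drop leaves roughly 3 kg still in the air at
# cutoff -- and the drop height changes as the vessel fills, which is why
# this term can't simply be calibrated out once.
```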

Finally, there's the deceleration force: material in motion causes the scale to read lighter or heavier than the actual value until the material has stopped. This is another significant contributor to spill.

Speed Up, Slow Down!

Of course, the answer to most of these problems is to slow down the feed rate. Yet this is simply not practical in modern process control with its emphasis on batch cycle time and productivity increases. This is the reason that multi-speed weighfeeders are used in many applications today. But a multi-speed feeder cannot run at full speed for the entire duration of the material transfer, so there will be negative effects on batch cycle time.
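The classic multi-speed approach is a "fast/dribble" sequence: feed at full rate, drop to a dribble rate near target, and cut off early by the expected spill. Here is an illustrative sketch of that logic, not any vendor's implementation; all names and numbers are assumptions.

```python
def two_speed_batch(setpoint, fast_rate, dribble_rate,
                    slowdown_margin, expected_spill, dt=0.1):
    """Simulate a fast/dribble feed; returns (final weight, cycle time)."""
    weight, t, rate = 0.0, 0.0, fast_rate
    while weight < setpoint - expected_spill:     # cut off early by the bias
        if weight >= setpoint - slowdown_margin:
            rate = dribble_rate                   # switch to dribble feed
        weight += rate * dt
        t += dt
    return weight + expected_spill, t             # spill lands after the stop

final, cycle_time = two_speed_batch(
    setpoint=100.0, fast_rate=5.0, dribble_rate=0.5,
    slowdown_margin=5.0, expected_spill=0.4)
```

The trade-off the article describes is visible here: widening the dribble zone improves accuracy but stretches the cycle time, since the feeder spends longer at the slow rate.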

In addition, multiple speeds mean gearing, transmissions, adjustable speed drives, and in some cases larger motors. This additional complexity increases maintenance and repair issues, and the expense of the device may not warrant its use.

This is why spill prediction is critical to fast and accurate control of feeders. There are many such algorithms, because developing one is not trivial. Many end user companies have their own proprietary algorithms, and most vendors of weighfeeders try to build some sort of spill prediction algorithm into their controller programs. A common method is to establish a fixed bias spill. In this method, a stop command is issued when the value of the weight, filtered for vibration, etc., is equal to the setpoint minus the expected spill. This doesn't always work, however.
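The fixed-bias method described above reduces to a one-line cutoff test; this minimal sketch (names assumed) makes the logic explicit.

```python
def should_stop(filtered_weight, setpoint, expected_spill):
    """Fire the stop command once the filtered weight reaches the
    setpoint minus a constant expected-spill bias."""
    return filtered_weight >= setpoint - expected_spill

# With a 0.5 kg bias against a 100 kg setpoint:
assert should_stop(99.6, 100.0, 0.5)        # within the bias: stop now
assert not should_stop(99.0, 100.0, 0.5)    # still short: keep feeding
```

The weakness is the word "fixed": if the actual spill drifts with material density, head pressure or surface height, a constant bias over- or under-shoots.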

How Do It Know? Designing a Batching Controller

There are several different kinds of batch transfer devices. A "Gain-in-Weight" feeder is a scale-based system that batches material by measuring the gain in weight of the destination vessel. A "Loss-in-Weight" feeder is a scale-based system as well. It batches material by detecting the loss in weight of the source vessel. Flow meter feeders are what they sound like. They batch product based on the volumetric flow of the material through the meter. Each has its own drawbacks, as well as its own benefits. Since all three types are common in the batch industries, any batching controller needs the control functions necessary to handle each of the three types of feeder.
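The three measurement principles can be captured in a few lines; this sketch is purely illustrative and the function names are my own.

```python
def gain_in_weight(dest_start, dest_now):
    """Delivered mass from the destination vessel getting heavier."""
    return dest_now - dest_start

def loss_in_weight(src_start, src_now):
    """Delivered mass from the source vessel getting lighter."""
    return src_start - src_now

def flow_meter_total(flow_rates, dt):
    """Delivered volume by integrating meter readings over time steps."""
    return sum(rate * dt for rate in flow_rates)
```

The same spill problem appears in all three, just measured from a different end: the gain-in-weight scale lags the destination, the loss-in-weight scale lags the source, and the flow meter totalizes whatever passes after the stop command.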

One of the most critical functions a good controller should have is a way to adaptively change the calculated spill value based on actual process conditions. Another, which is remarkably easy to set up in an ISA88-based batching controller, is abnormal situation management. Since the batch standard prescribes what is essentially state-based control, recognition of, and recovery from, abnormal operating states can be easily prescribed in the controller. This means that there should also be some built-in diagnostics for the weighfeeder and the entire material transfer system. Statistically based built-in fault detection and prevention facilities can be programmed into the batching controller to determine performance deviation from batch to batch, and adjust the controller accordingly, or shut the system down and call for maintenance automatically. Sometimes this is called a material transfer handler function, and includes error handling, fault tolerance, alarm management strategies, and status checks.
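One simple way to make the spill value adaptive, sketched here as an assumption rather than any controller's actual scheme, is to blend each batch's measured spill into the running estimate with an exponential moving average, so one outlier batch doesn't swing the prediction.

```python
def update_spill_estimate(current_estimate, actual_spill, gain=0.3):
    """Blend the spill observed on the last batch into the estimate.
    gain in (0, 1]: higher adapts faster, lower resists outliers."""
    return current_estimate + gain * (actual_spill - current_estimate)

estimate = 0.50                        # kg, initial fixed-bias guess
for observed in [0.62, 0.58, 0.61]:    # measured spill on recent batches
    estimate = update_spill_estimate(estimate, observed)
# the estimate drifts toward the ~0.6 kg the process is actually producing
```

The observed spill is just the final settled weight minus the weight at the stop command, so the update needs no extra instrumentation.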

More modern batching controllers even have the capability of using advanced process control algorithms for adaptive and predictive target management. These can be either model-based or non-model-based predictive control algorithms, and they can learn to minimize spill as the batch cycles continue.

Where Do the Functions Go?

Since the advent of ISA88, the batch standard, these functions are commonly built into the field controller. This is typically a PLC or PAC, although recently companies like Emerson, Honeywell and ABB, among others, have started including these batching functions in their DCS field controllers. Mostly these features are found in a batch executive, the batch execution engine inside whatever controller you're using. This is where all the phase, equipment module and control module logic resides, and the controls for the weighfeeders should be there too. Companies that manufacture weighfeeders, such as Mettler-Toledo, have begun producing custom controllers with a batching controller built into the weighfeeder's controls. This gives a single point of responsibility for the weighfeeder and the controller, so the end user knows who to go find if problems occur.

In the case of Mettler-Toledo's Q.impact controller system, the predictive control algorithms came from a long research project at Procter & Gamble. Dave Chappell, a batch management guru who was part of the project when he worked at P&G, noted, "We knew we had created a better mousetrap, but we were only able to support about 20% of the P&G deployment opportunities, so we began seeking outside assistance. In 2001, P&G licensed our PAC algorithm to Mettler Toledo, which has successfully commercialized its Q.impact material transfer controller."

Other projects have included both dry and wet material transfer, including liquids. Emerson and Lubrizol have been working on designing a statistically-based batch predictive algorithm for fine chemical and lubricant production since 1985, for example, and several companies have been working to extend those designs to the pharmaceutical and biopharmaceutical industries.

As the cost of memory and computational power declines, we can look for better algorithms and smarter controllers making faster batch cycle times and more accurate batching and material transfer.
