By Jim Montague, Executive Editor
Virtual reality is getting more like real reality every day.
Historically, simulation was the realm of NASA, aerospace, big oil-and-gas applications and other deep-pocket processes. However, ever faster and cheaper data processing is bringing simulation to masses of new users and types of applications in a seemingly inevitable progression from design and configuration, onward to training for routine and exceptional events, and more recently to optimization.
But why stop there? By bringing in more real-time data to make simulations more true-to-life, it soon becomes obvious they also can point out ways to improve an existing application's performance, in the same way earlier simulations identified bugs and needed fixes at the configuration and testing stages. Finally, as data processing keeps accelerating, moving information between a physical operation and its simulation is becoming easy and fast enough for simulations to closely assist—if not actually run—some real-time operations. Simulation's next stop? Real-time process control.
Still, even though simulation is moving into new realms, this doesn't mean it's forgetting its roots in design.
"Modeling and simulation have already transformed military, nuclear, aviation, automotive and durable goods manufacturing, and now they're starting to transform our consumer packaged goods industry for the same reasons—it costs too much and takes too long to build physical learning cycles, and so the products they produce aren't innovative," says Tom Lange, Procter & Gamble's (www.pg.com) modeling and simulation director, who presented "Virtual Prescience" earlier this year at ARC Advisory Group's (www.arcweb.com) annual forum event in Orlando, Fla.
"In our modeling world, we build the first prototypes, and they fit, work and make financial sense. We do stuff before it exists in the real world because the computers we have now are faster than the fastest computers in the world just 10 years ago. All this computing power is allowing us to replace physical cycles with virtual ones and pursue realism," explains Lange. "So instead of building a model, doing some calculations, getting close to the physical experiment and securing some guidance, realism means making a model that's indistinguishable from the physical experiment. This requires using computing power for much bigger and more complex problems—doing parametric studies instead of point estimates. We're no longer interested in what's going to happen with one simulation. We want to know about the 64 designs around it or doing 128 runs of all the experiments around one finite element."
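The shift Lange describes—from one point estimate to a parametric study of the designs surrounding it—can be sketched in a few lines. The surrogate model and design variables below are invented for illustration and stand in for whatever physics model a real study would sweep:

```python
import itertools

def mix_quality(rpm, fill_fraction):
    """Toy surrogate model (purely illustrative): predicted blend
    uniformity for one mixer design point. Not a real process model."""
    return 1.0 - abs(rpm - 90) / 300 - abs(fill_fraction - 0.7)

# Point estimate: one simulation run at the nominal design.
nominal = mix_quality(90, 0.7)

# Parametric study: the 8 x 8 = 64 designs "around" the nominal point.
rpms = [60 + 10 * i for i in range(8)]           # 60..130 rpm
fills = [0.40 + 0.05 * j for j in range(8)]      # 40%..75% full
sweep = {(r, f): mix_quality(r, f)
         for r, f in itertools.product(rpms, fills)}

best = max(sweep, key=sweep.get)
print(f"nominal design quality: {nominal:.3f}")
print(f"best of {len(sweep)} designs: rpm={best[0]}, "
      f"fill={best[1]:.2f}, quality={sweep[best]:.3f}")
```

The point of the sweep is not any single answer but the shape of the whole neighborhood: it shows how sensitive the design is to each parameter, which a single run cannot.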
For example, physically testing mixing of liquids used to involve dumping in materials and just seeing where they went, but this real-world method didn't scale up from small models to larger tanks. However, simulation that includes computational fluid dynamics (CFD) can be easily scaled up, according to Lange. "Likewise, if we're trying to mix some dense, viscous fluids, they may not mix in a tank. If we did this experiment in reality, we'd now have to clean the tank ourselves. So using a simulation is preferable because it can help us decide when to use a tank or a static mixer (Figure 1)," explains Lange. "And if we go with the static mixer, the simulation also can show how long it has to run, what its pressure drop will be and how well the material will be mixed. These are all questions that are entirely reasonable to be answered computationally. In their heads, people don't want to believe that simulations can be this accurate, but there are many times when the simulation performs better than the experiment."
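One of the questions Lange lists—what the static mixer's pressure drop will be—is often answered well short of full CFD, by scaling an empty-pipe Darcy-Weisbach calculation with a manufacturer's K factor. The sketch below assumes laminar flow and an invented placeholder K factor; real values come from the mixer vendor's data sheet:

```python
import math

def static_mixer_dp(flow_m3_s, diameter_m, length_m, rho, mu, k_factor=6.0):
    """Back-of-envelope pressure drop across a static mixer, laminar flow.

    Computes the empty-pipe Darcy-Weisbach drop and scales it by a
    vendor-style K factor (k_factor=6.0 is an assumed placeholder)."""
    area = math.pi * diameter_m ** 2 / 4
    v = flow_m3_s / area                       # mean velocity, m/s
    re = rho * v * diameter_m / mu             # Reynolds number
    if re >= 2300:
        raise ValueError("laminar correlation only (Re < 2300)")
    f = 64 / re                                # laminar friction factor
    dp_empty = f * (length_m / diameter_m) * rho * v ** 2 / 2
    return k_factor * dp_empty                 # Pa

# Glycerine-like viscous fluid: rho ~1260 kg/m^3, mu ~1.0 Pa*s
dp = static_mixer_dp(flow_m3_s=0.001, diameter_m=0.10, length_m=1.5,
                     rho=1260.0, mu=1.0)
print(f"estimated pressure drop: {dp / 1000:.1f} kPa")
```

A correlation like this gives a first screening number; the CFD run Lange describes is what confirms the mixing quality behind it.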
Reaching Out with OPC
To better understand simulation's recent data processing gains and its migration into optimization, we must look at how simulations are improving communications with their real-world counterparts.
"In the past, we used a simulator that our customer used. It had a fancy HMI package for graphics and software objects, but the problem for a system integrator like us was that it required a whole extra application step that we had to design, program and test, and this meant a lot of added time and money," says Ryan Gerken, technical director at E-Technologies Group (www.etech-group.com), a system integrator in West Chester, Ohio, which serves process and batch application users in consumer goods and pharmaceutical manufacturing. "So when we began a re-control project to migrate this same user's old batch process system from Honeywell's TDC 3000 DCS to Rockwell Automation's ControlLogix, we also needed a simulator because this production system runs at 100% capacity to keep up with demand. In this case, simulation can help us reverse engineer and work out problems without having to take down critical production lines."
During its search, Gerken reports that E-Technologies ran across Mynah Technologies' (www.mynah.com) MiMiC software, which is built on Microsoft's .NET and uses OPC-based servers from the OPC Foundation (www.opcfoundation.org) to expose its simulation environment, allowing it to be controlled from another HMI. "Previously, a whole simulation would have to be programmed separately, but MiMiC doesn't have to do this because it uses OPC servers that can communicate directly into the simulation environment," explains Gerken. "This connection allows users to write automation to an equipment database for valves or pumps, and ties the MiMiC environment to the HMI and back. So, if you have an HMI with graphic displays for those valves and pumps, then your operations guys can click on individual devices, open and close them in the simulation via the OPC server, and then simulate and run unusual conditions or alarms."
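The actual OPC plumbing depends on the server in use, but the round trip Gerken describes—an HMI writing a device command into the simulation's tag space and reading the simulated state back—can be sketched with a minimal stand-in tag server. No real OPC library is used here, and the class and tag names are invented for illustration:

```python
class TagServer:
    """Stand-in for an OPC server's tag space (illustrative only)."""
    def __init__(self):
        self._tags = {}

    def write(self, tag, value):
        self._tags[tag] = value

    def read(self, tag):
        return self._tags.get(tag)

class ValveModel:
    """Toy simulated valve: reacts to the command tag the HMI writes."""
    def __init__(self, server, cmd_tag, status_tag):
        self.server, self.cmd_tag, self.status_tag = server, cmd_tag, status_tag

    def scan(self):
        # One simulation cycle: act on the command, publish status back.
        cmd = self.server.read(self.cmd_tag)
        self.server.write(self.status_tag, "OPEN" if cmd == 1 else "CLOSED")

server = TagServer()
valve = ValveModel(server, "XV101.CMD", "XV101.STATUS")

server.write("XV101.CMD", 1)        # operator clicks "open" on the HMI graphic
valve.scan()                        # simulation cycle runs
print(server.read("XV101.STATUS"))  # -> OPEN
```

Because the HMI only ever sees tags, the same graphics that drive the simulated valve can later drive the real one—which is exactly what makes the approach useful for testing a re-control project against a line that cannot be taken down.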