Virtual reality is getting more like real reality every day.
Historically, simulation was the realm of NASA, aerospace, big oil-and-gas firms and other deep-pocketed operations. However, ever faster and cheaper data processing is bringing simulation to masses of new users and types of applications in a seemingly inevitable progression: from design and configuration, to training for routine and exceptional events, and more recently to optimization.
But why stop there? As more real-time data makes simulations more true-to-life, it becomes obvious that they can also point out ways to improve an existing application's performance, just as earlier simulations identified bugs and other needed fixes at the configuration and testing stages. Finally, as data processing keeps accelerating, moving information between a physical operation and its simulation is becoming easy and fast enough for some simulations to closely assist, if not actually run, some real-time operations. Simulation's next stop? Real-time process control.
Still, even though simulation is moving into new realms, this doesn't mean it's forgetting its roots in design.
"Modeling and simulation have already transformed military, nuclear, aviation, automotive and durable goods manufacturing, and now they're starting to transform our consumer packaged goods industry for the same reasons—it costs too much and takes too long to build physical learning cycles, and so the products they produce aren't innovative," says Tom Lange, Procter & Gamble's (www.pg.com) modeling and simulation director, who presented "Virtual Prescience" earlier this year at ARC Advisory Group's (www.arcweb.com) annual forum in Orlando, Fla.
"In our modeling world, we build the first prototypes, and they fit, work and make financial sense. We do stuff before it exists in the real world because the computers we have now are faster than the fastest computers in the world just 10 years ago. All this computing power is allowing us to replace physical cycles with virtual ones and pursue realism," explains Lange. "So instead of building a model, doing some calculations, getting close to the physical experiment and securing some guidance, realism means making a model that's indistinguishable from the physical experiment. This requires using computing power for much bigger and more complex problems—doing parametric studies instead of point estimates. We're no longer interested in what's going to happen with one simulation. We want to know about the 64 designs around it or doing 128 runs of all the experiments around one finite element."
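Lange's distinction between point estimates and parametric studies can be sketched in a few lines of code. The snippet below sweeps a toy surrogate model over an 8-by-8 grid of designs, producing the 64 cases he mentions; the model, parameter names and ranges are illustrative assumptions, not P&G's actual simulation.

```python
# Hypothetical parametric study: evaluate a toy mixing-performance model
# over an 8 x 8 grid of designs (64 cases) instead of a single point estimate.
import itertools

def mix_quality(impeller_rpm, viscosity_pas):
    """Toy surrogate model: higher speed helps mixing, higher viscosity hurts."""
    return impeller_rpm / (1.0 + 10.0 * viscosity_pas)

rpms = [50 * i for i in range(1, 9)]          # 50..400 rpm
viscosities = [0.1 * i for i in range(1, 9)]  # 0.1..0.8 Pa*s

# One run per design point across the full grid.
results = {
    (rpm, mu): mix_quality(rpm, mu)
    for rpm, mu in itertools.product(rpms, viscosities)
}

best = max(results, key=results.get)
print(len(results), best)  # 64 design points; best-performing combination
```

In a real study each grid point would be a full simulation run rather than a one-line formula, but the bookkeeping, a sweep over every combination rather than one case, is the same.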
For example, physically testing mixing of liquids used to involve dumping in materials and just seeing where they went, but this real-world method didn't scale up from small models to larger tanks. However, simulation that includes computational fluid dynamics (CFD) can be easily scaled up, according to Lange. "Likewise, if we're trying to mix some dense, viscous fluids, they may not mix in a tank. If we did this experiment in reality, we'd now have to clean the tank ourselves. So using a simulation is preferable because it can help us decide when to use a tank or a static mixer (Figure 1)," explains Lange. "And if we go with the static mixer, the simulation also can show how long it has to run, what its pressure drop will be and how well the material will be mixed. These are all questions that are entirely reasonable to answer computationally. In their heads, people don't want to believe that simulations can be this accurate, but there are many times when the simulation performs better than the experiment."
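The pressure-drop question Lange raises has a well-known back-of-the-envelope form even before a full CFD run: for laminar flow, a static mixer's drop is often estimated by scaling the empty-pipe Darcy-Weisbach drop by a mixer-specific factor. The sketch below shows that calculation; the factor K and all process numbers are assumed for illustration (real K values are geometry- and vendor-specific), not data from the article.

```python
# Back-of-the-envelope estimate of laminar pressure drop for a viscous
# fluid through a static mixer, as a sanity check on simulation output.

def empty_pipe_dp_laminar(velocity, density, viscosity, length, diameter):
    """Darcy-Weisbach pressure drop (Pa) with laminar friction factor f = 64/Re."""
    re = density * velocity * diameter / viscosity  # Reynolds number
    f = 64.0 / re
    return f * (length / diameter) * 0.5 * density * velocity**2

# Assumed process conditions: dense, viscous fluid in a 2 m long, 0.1 m bore mixer.
v, rho, mu, L, D = 0.5, 1200.0, 5.0, 2.0, 0.1
K = 6.0  # assumed mixer-element factor; vendor data replaces this in practice

dp_empty = empty_pipe_dp_laminar(v, rho, mu, L, D)
dp_mixer = K * dp_empty
print(f"empty pipe: {dp_empty:.0f} Pa, with mixer elements: {dp_mixer:.0f} Pa")
```

A CFD model resolves what this correlation lumps into K, the flow splitting and folding around each element, which is why the simulation can also answer the "how well mixed" question that no hand calculation reaches.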