Simulation Workbench Reduces Start-Up Complexity, Uncertainty

Engineering Toolbox Helps Shorten the Time Required to Develop an Operator Simulator


Reducing risk and improving delivered system quality are the primary goals of a new package of internal tools and processes that the Invensys Operations Management global delivery team developed and discussed at the company's North America Client Conference this week in Houston. The workbench tools, already in use on pilot projects, are designed to give teams more time to firm up requirements and define project data, automatically validate designs before configuration, test a system throughout its life cycle and support earlier operator training with medium-fidelity simulation. The workbench combines design information with engineering standards and automation rules, and its automation tools then generate the control, safety and simulation configurations.

"Automation projects face two types of challenges—the pinch and the lever," said Greg McKim, principal consulting engineer, dynamic simulation, at Invensys. "The pinch can come from increasingly complex plant designs with shorter drawing-board-to-completion times. Changes can come from the end user, the engineering firm or the vendors. The automation system is a last-in-line choke point, but the start-up date can't change."

The lever, on the other hand, can work for or against you. "The automation contract is a small percentage of the total cost of the project," explained McKim. "At a typical big process plant start-up, every lost day is lost revenue. A traditional factory-acceptance-test (FAT) process verifies that we built what you told us to build, but it doesn't validate that the system will work. The solution is to automate the process and validate the design."

Every project begins with the award of the contract, explained David James, principal engineer, Invensys Operations Management. "Because of the nature of the projects, they're complex," he said. "Things aren't quite ready when we need them. The information we're working on continues to change. And if we're introducing a training simulator, then that adds more problems, because it's supposed to be based on what actually is there at the end."

The Invensys team developed a set of tools called the Engineering Workbench that uses a series of rules to auto-generate the application. "Once we understand the requirements and standards, we can take multiple data deliveries," said McKim.

The Workbench tools address requirements definition, design, configure/build, test, install/commission and operate/maintain phases of the system life cycle.

"We've designed tools to use the data in the form you have it, rather than put the burden on the user to convert it," said James. "To capture design in a data-centric format and collaboration in design, there's integrated design data. To capture Invensys and client engineering decisions, there are automation rules and libraries. And for pre-tested and documented modular engineering standards, there are engineering standards and best practices."

Engineering Workbench, which sits above the tools, takes the templates developed during the non-critical phase. "It takes rules from the client's head," explained James. "Using those templates and rules, it automatically generates the complete control module database and safety module database."
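The rules-and-templates approach James describes can be sketched in a few lines. This is a minimal illustration, not the Workbench itself: the names (`Template`, `RULES`, `generate_modules`) and the device types are hypothetical, standing in for the client's captured engineering decisions and pre-tested modular standards.

```python
# Hypothetical sketch of rules-driven control-module generation:
# automation rules map each device type in the design data to a
# pre-tested template plus client-specific parameter overrides.

from dataclasses import dataclass, field

@dataclass
class Template:
    """A pre-tested modular engineering standard for one device class."""
    name: str
    parameters: dict = field(default_factory=dict)

# "Rules from the client's head", captured as data (illustrative values).
RULES = {
    "motor": Template("M_STD",  {"interlock": True,  "alarm_delay_s": 2}),
    "valve": Template("XV_STD", {"interlock": False, "travel_time_s": 10}),
}

def generate_modules(design_rows):
    """Turn raw design data (tag, device type) into control-module records."""
    modules = []
    for tag, dev_type in design_rows:
        tpl = RULES[dev_type]
        modules.append({"tag": tag, "template": tpl.name, **tpl.parameters})
    return modules

design = [("P-101", "motor"), ("XV-201", "valve")]
db = generate_modules(design)
```

Because the rules are data rather than hand configuration, a fresh data delivery simply regenerates the module database, which is what lets the team "take multiple data deliveries."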

Having addressed the pinch, McKim then addressed the lever, specifically as it pertains to tieback simulation.

"A virtual control system eliminates the need for hardware," he said. "Traditional tieback gets its name from looping I/O modules back. We replace the hardware with FSIM, and we have a tool for automating the tieback simulation. This tool has a rulebook with search-and-paste criteria."
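A "rulebook with search-and-paste criteria" can be pictured as pattern rules that match each control-system output point and paste in the input point it should loop back to. The sketch below assumes a simple, made-up tag-naming convention; the rule patterns and point names are illustrative, not Invensys's actual rulebook.

```python
# Hypothetical sketch of automated tieback generation: search each
# output point against a rulebook of patterns, then paste in the
# software loopback that replaces the hardwired I/O connection.

import re

# Rulebook (illustrative): output-tag pattern -> tieback input-tag template.
RULEBOOK = [
    (re.compile(r"^(?P<tag>\w+-\d+)\.OUT$"), "{tag}.IN"),  # discrete loopback
    (re.compile(r"^(?P<tag>\w+-\d+)\.SP$"),  "{tag}.PV"),  # setpoint -> process value
]

def build_tiebacks(output_points):
    """Map each matched output point to its simulated input point."""
    ties = {}
    for point in output_points:
        for pattern, target in RULEBOOK:
            match = pattern.match(point)
            if match:
                ties[point] = target.format(**match.groupdict())
                break
    return ties

ties = build_tiebacks(["XV-101.OUT", "FIC-200.SP"])
```

Run against the full point database, a rulebook like this produces the entire tieback wiring in one pass, with no physical I/O modules involved.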

Because P&IDs have become smarter, the group was able to develop an auto-model-generator. "It takes about one second per P&ID to make a model," said McKim. "There's a big time savings in terms of building the model. It picks up all the equipment and all the transfers, so it allows you to automatically tie it to the control system."
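A "smart" P&ID carries its equipment and line lists as data, so model generation reduces to one pass over that data. The sketch below assumes a hypothetical XML export format; the element and attribute names are invented for illustration and will differ from any real P&ID tool.

```python
# Hypothetical sketch of auto-model generation from a data-rich P&ID:
# read the equipment and transfer lines out of the drawing's data
# export and emit a connected process-model graph.

import xml.etree.ElementTree as ET

PID_XML = """
<pid drawing="D-1001">
  <equipment tag="T-100" type="tank"/>
  <equipment tag="P-101" type="pump"/>
  <line from="T-100" to="P-101"/>
</pid>
"""

def generate_model(xml_text):
    """Build a model graph: equipment become nodes, lines become edges."""
    root = ET.fromstring(xml_text)
    nodes = {e.get("tag"): e.get("type") for e in root.iter("equipment")}
    edges = [(ln.get("from"), ln.get("to")) for ln in root.iter("line")]
    return {"drawing": root.get("drawing"), "nodes": nodes, "edges": edges}

model = generate_model(PID_XML)
```

Because every piece of equipment and every transfer is picked up from the drawing data, the generated model's tags line up with the control system's tags, which is what makes the automatic tie-in possible.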

McKim also differentiated verification from validation. "Verification says, ‘We built one loop, and it works,'" he said. "Validation is a semi-representation of the process, and you can go through and make sure the application works. And debugging it virtually, instead of on the live unit, is much faster."
