Bronco's staff also uses FactoryPMI to serve up data to help them manage their tanks and raw materials more efficiently. For example, if a tank's temperature is too high, they'll be notified, and can adjust their procedures. Bronco is also moving to adopt wireless tank gauging, which will be monitored by FactoryPMI.
Likewise, during the 24/7 grape harvesting season, known as "crush," FactoryPMI works with the company's ProPak software and its "grower relations" database. ProPak analyzes each load, and FactoryPMI interrogates the ProPak database and compares that information with its own. Because it's crucial for each truck to dump into the right pit, trucks are only allowed to dump if their documentation matches correctly.
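The dump-authorization logic described above can be sketched as a simple record comparison. This is a hypothetical illustration only; the field names and the `may_dump` function are assumptions, not Bronco's actual schema or FactoryPMI's implementation.

```python
# Hypothetical sketch of the crush-season dump check: a load record from
# the grower-relations (ProPak) database is compared against the plant's
# own record, and the truck may dump only when the documentation matches.
# All field names here are illustrative.

def may_dump(propak_record, plant_record):
    """Allow a truck to dump only if both databases agree on the load."""
    keys = ("truck_id", "grower_id", "variety", "destination_pit")
    return all(propak_record.get(k) == plant_record.get(k) for k in keys)

load = {"truck_id": "T-114", "grower_id": "G-27",
        "variety": "chardonnay", "destination_pit": 3}
assert may_dump(load, dict(load))                          # records match
assert not may_dump(load, {**load, "destination_pit": 4})  # wrong pit: refuse
```

In practice the comparison would run against live database queries, but the gating principle is the same: no match, no dump.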
"It makes me a better manager," says Franzia. "Efficiencies have improved upwards of 30%, productivity targets are hit every day, and I can be more responsive to the business and to my managers."
Guts of Virtualization
One of the most remarkable aspects of the data processing revolution is that computing power has grown so fast that many applications haven't kept pace. As a result, many PCs use only a small fraction of their capacity, and the rest goes largely to waste. This is where virtualized computing comes in.
Honeywell Process Solutions (HPS, https://hpsweb.honeywell.com) reports that virtualization can slash the number of PCs needed to perform the same amount of data processing by 75% or more and produce equally huge savings in maintenance and power consumption. This is achieved by breaking the formerly unbreakable bond between the operating system (OS) software and hardware running traditional one-box PCs, and instead enabling one computer to run multiple OSs for multiple users at the same time.
"Users want to reduce the number of PCs in their facilities and their total cost of ownership (TCO), but they can only do it if they don't compromise existing safety, reliability or production," says Paul Hodge, Honeywell's product manager for Experion Infrastructure and HMI. "However, as PCs evolved, they became increasingly inflexible due to the tight coupling between their OSs and underlying hardware, so the challenge for virtualization is to break this coupling between these layers."
Hodge added that virtualization consists of three main families of computing technology that can enable much greater levels of computing flexibility and agility. These include platform virtualization, which extracts the OS from the hardware; application virtualization, which separates the application from the OS; and client virtualization, which extracts the user interface from the OS. "Without platform virtualization, users must run multiple applications on separate OSs in separate boxes, so they end up with very low utilization of their computing hardware," said Hodge. "However, computers have gotten much faster lately, so most applications only use 5% to 10% of their individual PC's resources, and this leaves a lot of those resources and money on the table."
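The consolidation arithmetic behind these utilization figures is easy to work through. The sketch below is illustrative only; it combines Hodge's 5% to 10% per-application figure with the roughly 25% spare-capacity margin suggested elsewhere in this article, and the `guests_per_host` function is an assumption for demonstration, not a vendor sizing tool.

```python
# Back-of-the-envelope consolidation math: if each application uses only
# a small percentage of its own PC, one hypervisor host can carry several
# guest machines while still keeping spare headroom. Integer percentages
# are used to keep the arithmetic exact.

def guests_per_host(app_util_pct, headroom_pct=25):
    """Guests that fit on one host, keeping headroom_pct capacity spare."""
    usable_pct = 100 - headroom_pct       # capacity available to guests
    return usable_pct // app_util_pct     # whole guests only

assert guests_per_host(10) == 7   # 10% apps: ~7 guests per host
assert guests_per_host(5) == 15   # 5% apps: ~15 guests per host
```

Even the conservative case replaces seven separate PCs with one host, which is how reductions of 75% or more in machine count become plausible.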
Hodge added that, consequently, virtualization is achieved by placing a thin software layer, called a hypervisor, between the OS and the underlying hardware, which enables multiple OSs to run and be supported on one PC box. This hypervisor also includes a "virtual hardware layer" that emulates x86 hardware, giving each guest OS the same operating parts and functions as a regular PC.
"Virtualization also improves site protection because users can 'snapshot' computers back to before problems occurred. It's also much easier to restore virtual machine files," said Hodge. "In fact, if an entire site somehow becomes unavailable, the whole site's virtualized computing workload can be moved from one location to another. Without virtualization, you have a large number of servers that can be hard to manage, interoperability problems, and hardware that's time consuming to procure. Platform virtualization reduces the number of servers, allows better server and client manageability, improves interoperability, but preserves needed isolation in the virtual machines, and increases server and user agility."
Ron Kambach, platform and supervisory applications product manager at Invensys Operations Management (www.iom.invensys.com), explains, "The basic benefits of virtualization include server consolidation with smaller OS footprint and virtualized hardware, and reduced costs by using less space, facilities, hardware, maintenance and power. Virtualization also provides application compatibility by using OS isolation to help run legacy and incompatible systems and applications, and allows centralized management, faster installation and deployment, and greater use of software templates. For example, users can snapshot multiple versions of a virtual machine, so if one goes down, they can just go back to the version from 10 seconds earlier. In fact, users can have a library of different devices and easily set up a virtual network or put together a sandbox of tools to meet the needs of particular applications. To accomplish these functions safely, host servers should always have spare resources about 25% above what the virtual, guest machines require."
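The snapshot-and-rollback idea Kambach describes can be illustrated with a toy model. Real hypervisors snapshot disk and memory state at the storage layer; the `SnapshotStore` class below is purely a hypothetical sketch of the bookkeeping, not any vendor's API.

```python
# Toy illustration of VM snapshot and rollback: keep periodic copies of a
# virtual machine's state so a failed guest can be restored to the version
# saved moments earlier. Names and structure are illustrative only.

import copy

class SnapshotStore:
    def __init__(self):
        self._snaps = []                           # list of saved states

    def take(self, vm_state):
        """Record an independent copy of the VM's current state."""
        self._snaps.append(copy.deepcopy(vm_state))

    def restore_latest(self):
        """Return a copy of the most recently saved state."""
        return copy.deepcopy(self._snaps[-1])

store = SnapshotStore()
vm = {"app": "historian", "status": "running"}
store.take(vm)                       # snapshot taken while healthy
vm["status"] = "crashed"             # guest goes down
vm = store.restore_latest()          # roll back to the saved version
assert vm["status"] == "running"
```

The deep copies matter: each snapshot must be independent of the live state, or a later fault would corrupt the saved version too.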
However, Kambach adds that "Virtualization 2.0" enables more than consolidation. It also permits simpler installation and movement of software apps, lockdown of corporate PC images, better software distribution, backup images of virtual machines for quicker recovery, restacking workloads for much easier, on-the-fly work movement, isolation of hardware differences, and division of functions into smaller virtual servers. In addition, Kambach says some market predictions for virtualization include the likelihood that the hypervisors that enable it will become commodity items; management solutions will be available for sale from vendors; users will be able to set up either private or public cloud servers that include virtual machines; and their resources will be organized and managed as a "fabric" that includes optimization and lifecycle control.
Friendly Faces on New PCs
One of the perks of high-capacity data processing is that users can make initially alien-looking computing tools look just like familiar instruments and displays. For instance, National Fuel Gas in Williamsville, N.Y. (www.natfuel.com), recently partnered with engineering integrator EN Engineering in Woodridge, Ill. (www.enengineering.com), to upgrade a few of the 40 compressor stations that move natural gas over its 2,877 miles of pipeline, which bring gas to its 728,000 customers in western New York and northwestern Pennsylvania. The upgrade was also needed to help National take advantage of increased development and gas recovery in the local Marcellus Shale region.
The initial project upgraded 12 compressor units at two compressor stations, one in Roystone, Pa., and the other in Independence, N.Y. The Roystone station has eight Ajax compressor units, five headers, six operating configurations, and a storage field of 2.5 billion cubic feet (BCF). The Independence station has four Ingersoll-Rand compressor units, four headers, 10 operating configurations, a gas dehydration unit and 4.0 BCF storage field. The upgrade's main challenges were to understand and replicate functionality of the existing controls; integrate new control systems with existing systems; interface new control panels to existing equipment and instrumentation; and prevent disruption of operations during installation. (Figure 2)
"We used a unitized design concept, and then employed Rockwell Automation's ControlLogix PLCs with Flex I/O, as well as redundant PC-based HMIs with FactoryTalk View SE at the station level, and PanelView operator interfaces with FactoryTalk View ME at the unit level," reported Jennifer Shaller, National's lead electrical engineer. "We also used a plant-wide, fiber-optic control network with Stratix managed Ethernet switches, put all control functionality in a PLC, hardwired our shutdown circuits, and made sure we followed a Class I, Division 2 design."
Shaller added that the upgrade has given National's two stations more consistent and reliable control, fully automated compressor operation, more efficient station operations, enhanced data collection, better diagnostic and troubleshooting capabilities, improved reliability of the control systems, better mechanical protection of the integral compressor units, and opportunities for additional control functions.
"The new compressor controls have all the legacy look and feel that our operators needed, but they no longer have to deal with the stress of continually babysitting them," explained Shaller.