Industrial Computers, Part 2. Data Processing Escapes the Enclosure

Whether It Happens on a Cloud-Based Service, Virtualized Server or Plain Old Wireless, Internet or Ethernet, It's Clear That Industrial Computing for Process Control Has Moved Beyond Its Old Laptops and Desktops. So How Can You Protect Such Far-Flung Data Processing?

Page 2 of 3

Similarly, Ken Cullum, maintenance manager at the Ceres winery, adds, "Our refrigeration guys used to record the same data in four different places. Now they enter it via the web at any FactoryPMI station."

In addition, though its four main facilities are many miles apart, Bronco needs them to appear on-screen as if they were under the same roof. Fortunately, FactoryPMI's project redirection feature allows that to happen. There are presently six servers running FactoryPMI: four in Ceres, one in Escalon and one in Napa, each hosting certain projects. When a user needs to view a different part of the operation, the software redirects the client to the required project, even if it's on a different server. Because multiple servers can run the same projects, this "server clustering" method also supports redundancy, and Bronco plans to configure its servers in a clustered environment in the future to provide added fault tolerance.
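The redirection idea can be sketched in a few lines. This is a hypothetical illustration, not FactoryPMI's actual mechanism; the server names, project names and lookup logic are all invented for the example.

```python
# Hypothetical sketch of project redirection: each server hosts
# certain projects, and a client asking for a project that isn't
# hosted locally is redirected to the server that has it.
# Server and project names are invented.

SERVERS = {
    "ceres-1":   {"fermentation", "refrigeration"},
    "escalon-1": {"bottling"},
    "napa-1":    {"barrel-room"},
}

def locate_project(project, local_server):
    """Return the server a client should connect to for `project`."""
    if project in SERVERS.get(local_server, set()):
        return local_server          # already hosted here, no redirect
    for server, projects in SERVERS.items():
        if project in projects:
            return server            # redirect the client elsewhere
    raise KeyError(f"no server hosts project {project!r}")
```

In this sketch, a Ceres client asking for "bottling" would be handed off to the Escalon server, which is the behavior the article describes from the user's point of view.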

Bronco's staff also uses FactoryPMI to serve up data to help them manage their tanks and raw materials more efficiently. For example, if a tank's temperature is too high, they'll be notified, and can adjust their procedures. Bronco is also moving to adopt wireless tank gauging, which will be monitored by FactoryPMI.
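The tank-temperature notification described above amounts to a threshold check. The sketch below is a minimal illustration; the alarm limit and tank naming are assumptions, not Bronco's actual setpoints.

```python
# Minimal sketch of a high-temperature notification check.
# The 60 F limit is an invented example, not a real winery setpoint.

TANK_HIGH_LIMIT_F = 60.0

def check_tank(tank_id, temp_f, limit=TANK_HIGH_LIMIT_F):
    """Return an alert string if the tank is too warm, else None."""
    if temp_f > limit:
        return f"ALERT: tank {tank_id} at {temp_f:.1f} F exceeds {limit:.1f} F"
    return None
```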

Likewise, during the 24/7 grape harvesting season, known as "crush," FactoryPMI works with the company's ProPak software and its "grower relations" database. ProPak analyzes each load, and FactoryPMI interrogates the ProPak database and compares this information with its own. Because it's crucial for the right truck to dump into the right pit, trucks are only allowed to dump if their documentation matches correctly.
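The cross-check between the two databases can be pictured as a simple record comparison. This is a sketch under stated assumptions: the field names (`truck_id`, `grower_id`, `pit`) are invented stand-ins for whatever the real ProPak and grower-relations records contain.

```python
# Hypothetical sketch of the dump authorization check: a truck may
# dump only if its ProPak load record and the grower-relations
# record agree on the key fields. Field names are assumptions.

def may_dump(propak_record, grower_record):
    """Allow a dump only when both databases agree on truck, grower and pit."""
    keys = ("truck_id", "grower_id", "pit")
    return all(propak_record.get(k) == grower_record.get(k) for k in keys)
```

A mismatch on any field, such as a truck routed to the wrong pit, denies the dump, which matches the gating behavior the article describes.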

"It makes me a better manager," says Franzia. "Efficiencies have improved upwards of 30%, productivity targets are hit every day, and I can be more responsive to the business and to my managers."

Guts of Virtualization

One of the most striking aspects of the data processing revolution is that computing power has grown so fast that many applications haven't kept pace. As a result, many PCs use only a small fraction of their capacity, and the rest sits idle. This is where virtualized computing comes in.

Honeywell Process Solutions (HPS, https://hpsweb.honeywell.com) reports that virtualization can slash the number of PCs needed to perform the same amount of data processing by 75% or more and produce equally huge savings in maintenance and power consumption. This is achieved by breaking the formerly unbreakable bond between the operating system (OS) software and hardware running traditional one-box PCs, and instead enabling one computer to run multiple OSs for multiple users at the same time.

"Users want to reduce the number of PCs in their facilities and their total cost of ownership (TCO), but they can only do it if they don't compromise existing safety, reliability or production," says Paul Hodge, Honeywell's product manager for Experion Infrastructure and HMI. "However, as PCs evolved, they became increasingly inflexible due to the tight coupling between their OSs and underlying hardware, so the challenge for virtualization is to break this coupling between these layers."

Hodge added that virtualization consists of three main families of computing technology that can enable much greater levels of computing flexibility and agility. These include platform virtualization, which extracts the OS from the hardware; application virtualization, which separates the application from the OS; and client virtualization, which extracts the user interface from the OS. "Without platform virtualization, users must run multiple applications on separate OSs in separate boxes, so they end up with very low utilization of their data processing hardware," said Hodge. "However, computers have gotten much faster lately, so most applications only use 5% to 10% of their individual PC's resources, and this leaves a lot of those resources and money on the table."
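The utilization figures above imply some simple consolidation arithmetic: if each application uses only 5% to 10% of a dedicated PC, one host of comparable power can carry several such workloads. The sketch below is back-of-the-envelope only; the 25% headroom fraction is an assumption for the example, not a Honeywell figure.

```python
import math

# Back-of-the-envelope consolidation math: how many physical hosts
# are needed to run n_apps workloads that each consume a given
# fraction of one machine. The headroom fraction is an assumption.

def hosts_needed(n_apps, util_per_app=0.10, headroom=0.25):
    """Physical hosts needed to carry n_apps virtualized workloads."""
    capacity_per_host = 1.0 - headroom            # usable fraction per host
    apps_per_host = int(capacity_per_host // util_per_app)
    return math.ceil(n_apps / apps_per_host)
```

With these assumed numbers, 28 lightly loaded PCs collapse onto 4 hosts, a reduction of roughly 86%, which is consistent with the "75% or more" figure Honeywell cites.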

Hodge added that, consequently, virtualization is achieved by placing a thin software layer, called a hypervisor, between the OS and the underlying hardware, which enables multiple OSs to run and be supported on one PC box. This hypervisor also includes a "virtual hardware layer" that emulates x86 computing, giving each guest OS all the same operating parts and functions as a regular PC.

"Virtualization also improves site protection because users can 'snapshot' computers back to before problems occurred. It's also much easier to restore virtual machine files," said Hodge. "In fact, if an entire site somehow becomes unavailable, the whole site's virtualized computing workload can be moved from one location to another. Without virtualization, you have a large number of servers that can be hard to manage, interoperability problems, and hardware that's time-consuming to procure. Platform virtualization reduces the number of servers, allows better server and client manageability, and improves interoperability, while preserving needed isolation in the virtual machines and increasing server and user agility."

Ron Kambach, platform and supervisory applications product manager at Invensys Operations Management (www.iom.invensys.com), explains, "The basic benefits of virtualization include server consolidation with smaller OS footprint and virtualized hardware, and reduced costs by using less space, facilities, hardware, maintenance and power. Virtualization also provides application compatibility by using OS isolation to help run legacy and incompatible systems and applications, and allows centralized management, faster installation and deployment, and greater use of software templates. For example, users can snapshot multiple versions of a virtual machine, so if one goes down, they can just go back to the version from 10 seconds earlier. In fact, users can have a library of different devices and easily set up a virtual network or put together a sandbox of tools to meet the needs of particular applications. To accomplish these functions safely, host servers should always have spare resources about 25% above what the virtual, guest machines require."
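The rollback behavior Kambach describes, stepping a failed virtual machine back to a snapshot taken moments earlier, can be illustrated with a toy store of timestamped states. This is only a conceptual sketch; real hypervisor snapshots capture disk and memory images, not Python objects.

```python
import copy

# Toy illustration of snapshot/rollback: keep timestamped copies of
# a VM's state and restore the newest one from before a failure.
# Real hypervisors snapshot disk and memory images, not dicts.

class SnapshotStore:
    def __init__(self):
        self._snaps = []  # (timestamp, state) pairs, oldest first

    def snapshot(self, ts, state):
        """Record a deep copy of the VM state at time `ts`."""
        self._snaps.append((ts, copy.deepcopy(state)))

    def restore_before(self, ts):
        """Return the newest state captured strictly before `ts`."""
        for snap_ts, state in reversed(self._snaps):
            if snap_ts < ts:
                return copy.deepcopy(state)
        raise LookupError("no snapshot before that time")
```

A machine snapshotted every 10 seconds that fails at second 15 can thus be rolled back to its second-10 state, mirroring the "version from 10 seconds earlier" example in the quote.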
