You're almost certainly using the cloud on a regular basis for personal purposes. And odds are you're also using it at your company for business purposes.
You use the cloud for personal purposes if you have a web mail account (Gmail, Yahoo, etc.); if you have a social media account (Facebook, Twitter, etc.); if you've ever downloaded music or a movie; if you've used a file transfer site; if you store data such as photos or documents on the web; or if you've ever downloaded software from the web.
"We see private cloud technologies becoming commonplace in the industrial space because they provide redundancy, project backup and easy restoration, while adding the benefit of reduced hardware costs," says Steve Schneebeli, lead systems engineer at Malisko Engineering.
The cloud can be confusing, so let's start with a few definitions, namely what constitutes a public, a private and a hybrid cloud, and what types of services are typically provided through each.
What Kind of Cloud Is This?
"A public cloud infrastructure is owned by an organization, and that organization typically provides access to its cloud services for a fee or in exchange for subjecting the user to advertising," explains Larry Combs, vice president of customer service and support for InduSoft. Web mail is a good example, as are file storage and transfer sites.
The table on page 62 lists some of the advantages of using the public cloud instead of an internal infrastructure. In almost all cases, the public cloud will be much cheaper, faster to bring online and easier to expand. For applications that require large file downloads, such as software updates, the fast, widely distributed access provided by a public cloud is a virtual necessity.
A private cloud infrastructure is operated by and for a particular organization, and it may exist either on or off its premises. A virtualized server farm within a process plant is a good example of an on-premises private cloud.
Hybrid clouds are a type of public cloud hosted for a particular application or customer. An example would be an application hosted by a cloud service provider for one of its customers, with the particular application and customer separated from all others.
In all cases, virtualization is used in the cloud to allow multiple operating systems and associated applications to run on a smaller number of computers than would be required with a traditional one PC/one operating system architecture.
Virtualization obviously saves money, space and energy because fewer PCs are needed, but its chief advantages are greater reliability, improved application longevity and simpler upgrades and changes.
Public and hybrid cloud computing services can be divided into three categories: Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS).
"IaaS provides on-demand provisioning by a cloud service provider to a customer of virtual servers, storage, networks and other fundamental computing resources," notes Combs. It can be provided as a public cloud, as with commercial file storage services, or as a hybrid cloud. In either case, customers only pay for the computing resources that they use, and they can quickly bring additional capacity and resources online as needed.
"PaaS is a set of software and product development tools hosted on the cloud provider's infrastructure and used by customers as desired. Developers use these tools to create applications via the Internet. Google Apps is a leading example, with Google providing word processing and other web-hosted applications. PaaS is almost always provided as a public cloud," adds Combs.
SaaS, like web-based email, affords consumers the capability to use a provider's applications that are running on a cloud infrastructure from various devices such as a PC, a smart phone or a tablet—often through an app or a web browser. Consumers generally pay a fee or agree to be subjected to advertising for this public cloud service.
And it turns out that SaaS has found a home in the process industries, namely for remote access. With remote access and other related applications, SaaS makes the cloud-based computing infrastructure someone else's responsibility, freeing the process automation professional to focus on operational functions as opposed to IT matters.
SaaS Improves Remote Access
With SaaS for remote access, a supplier creates a cloud-based application that can communicate to various types of hardware and software platforms such as RTUs, PLCs and operator interface terminals installed at remote sites. The application can also communicate to remote access hardware including smart phones and tablets, PC-based HMI platforms, and databases.
The supplier then markets this cloud-based, remote-access SaaS to customers, charging a monthly fee for use, typically based on the number of nodes and the amount of data that passes through its cloud. The advantage to end users is that they only need to provide connectivity to the cloud from each remote site, with all other data communication infrastructure provided by the supplier for a monthly fee.
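A fee "based on the number of nodes and the amount of data" usually reduces to a simple two-part tariff. The sketch below shows the idea; the rate constants are purely illustrative and are not any supplier's actual pricing.

```python
# Hypothetical two-part tariff for a cloud remote-access SaaS:
# a per-node charge plus a per-megabyte data charge.
# Rates are illustrative only; real supplier pricing varies.

def monthly_fee(nodes, data_mb, rate_per_node=10.0, rate_per_mb=0.05):
    """Monthly fee = node count * node rate + data volume * data rate."""
    return nodes * rate_per_node + data_mb * rate_per_mb

# A 25-node site pushing 2,000 MB of data per month:
fee = monthly_fee(25, 2000)
print(f"${fee:.2f}")  # → $350.00
```

Under a model like this, trimming data volume (see report-by-exception, below) directly lowers the bill, which is why bandwidth-minimizing architectures matter commercially as well as technically.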
SoftPLC's TagWell is a good example. "TagWell is a cloud-based, bi-directional portal to SoftPLC remotes, which uses an application programming interface (API) to allow customers to perform remote management of their process automation systems," explains Cindy Hollenbeck, vice president of SoftPLC (Figure 1).
"SoftPLC remotes can be gateways to existing equipment, or they can be a PAC and gateway. With the API, applications that run in TagWell can read/write to the tags in any remote," adds Hollenbeck. When the remote is a gateway, any vendor's automation system can be used, as long as it can communicate to the gateway. In other cases, the SoftPLC remote is the main controller for the application as well as for the gateway.
"The entire architecture is designed to minimize bandwidth use to cut costs for cellular or other costly communication network interfaces for remote systems such as satellite, but it can also work on hard-wired Ethernet connections. The remote can be programmed to report only by exception to further reduce bandwidth requirements," notes Hollenbeck.
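Report-by-exception means the remote transmits a tag value only when it has changed meaningfully since the last report, rather than on every scan. A minimal sketch of the idea, with a hypothetical deadband and tag name (not TagWell specifics):

```python
# Sketch of report-by-exception: a remote transmits a tag value only
# when it has moved outside a deadband around the last reported value.
# Deadband and tag names are illustrative, not any vendor's defaults.

class ExceptionReporter:
    def __init__(self, deadband):
        self.deadband = deadband
        self.last_reported = {}  # tag name -> last value actually sent

    def update(self, tag, value):
        """Return the value if it should be reported, else None."""
        last = self.last_reported.get(tag)
        if last is None or abs(value - last) >= self.deadband:
            self.last_reported[tag] = value
            return value  # report: would be sent up to the cloud portal
        return None       # suppress: saves cellular/satellite bandwidth

rep = ExceptionReporter(deadband=0.5)
print(rep.update("tank_level", 10.0))  # 10.0 (first sample always reported)
print(rep.update("tank_level", 10.2))  # None (within deadband, suppressed)
print(rep.update("tank_level", 10.6))  # 10.6 (moved >= 0.5 from 10.0)
```

On a slowly changing signal like a tank level, suppressing in-deadband samples can eliminate the bulk of the traffic that a fixed polling scheme would generate.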
One SoftPLC customer uses TagWell for monitoring chemical tank levels. Each tank is equipped with an embedded SoftPLC RTU that measures level. The level is reported to TagWell where all the tanks can be viewed in a browser optimized for viewing on a smart phone. TagWell also provides the tank level information to the customer's SCADA system once per day, and immediately reports refill alarms, so the logistics system can schedule deliveries. Critical low-level alarms are sent to the SCADA system and as text messages.
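The tank application above routes alarms by severity: refill alarms go to the SCADA/logistics system, while critical low-level alarms also go out as text messages. A hypothetical routing sketch (thresholds and destination names are invented for illustration):

```python
# Illustrative severity-based alarm routing for the tank-level example:
# refill alarms reach the SCADA system so deliveries can be scheduled;
# critical-low alarms additionally go out as SMS text messages.
# Threshold values are hypothetical.

REFILL_LEVEL = 20.0    # percent full: time to schedule a delivery
CRITICAL_LEVEL = 5.0   # percent full: notify someone immediately

def route_alarm(level):
    """Return the list of destinations an alarm at this level should reach."""
    if level <= CRITICAL_LEVEL:
        return ["scada", "sms"]
    if level <= REFILL_LEVEL:
        return ["scada"]
    return []

print(route_alarm(18.0))  # ['scada']
print(route_alarm(4.0))   # ['scada', 'sms']
```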
Vipond Controls is a system integrator in Calgary, Alberta, Canada, that provides a hybrid, cloud-based, SCADA SaaS solution based on InduSoft's Web Studio software to its customers in the oil and gas industry. Vipond customers use its cloud-based remote access solution instead of purchasing and installing SCADA software and hardware at each remote site.
A typical installation consists of one or more controllers and/or RTUs at a remote production site, such as an oil well, explains Darryl Vipond, president of Vipond Controls. Each of these local devices is connected to Vipond's cloud-based iSCADA via radio, cellular or satellite connections. No SCADA hardware or software is required at the site because iSCADA provides this function remotely.
Once the data is uploaded to the iSCADA application in the cloud, it's available for remote access. "A key feature of iSCADA is very fast response rates, which enable us to deliver a remote HMI experience in near-real time," Vipond says. "This remote viewing can be delivered through any web browser, a PC set up as a thin client, or a smart phone such as the iPhone and certain Android phones."
Vipond adds, "Our SCADA solution creates a unique experience for each client by using a hybrid cloud. With iSCADA, each customer has their own virtual machine running within Vipond's server cloud. All data is kept safe and independent of other machines running in the cloud." (Figure 2)
Remote asset management is another process application that fits well with the SaaS model (Cloud-Based Asset Management). Remote access and asset management are public cloud services, but private clouds are perhaps the most widespread implementation in process industry end user firms.
Private Cloud Savings
Process industry firms are very concerned with protecting their automation and information systems from cyber attacks. They also demand nearly 100% uptime, and are leery of cloud implementations that require an Internet connection, so private clouds often make the most sense for process automation applications.
A certain major pulp and paper company uses virtual machines in a private cloud to host the application software for the distributed control systems (DCSs) in some of its largest manufacturing facilities. Besides running the main application programs, it uses the cloud to host thin-client HMI stations for its operators.
In its largest private cloud implementation, it is using VMware. VMware provides a set of software tools that help users virtualize PCs to run multiple operating systems on one machine.
For just the DCS part of that installation, the company had roughly 30 servers that now run on 12 virtual machines. There were about 60 client workstations, and those are now hosted by 10 virtual machines. Using a virtualized private cloud to reduce the number of PCs from 90 to 22 cleared up a lot of floor space, and cut the firm's heat load dramatically.
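The consolidation arithmetic in the example above is worth making explicit, since it drives the floor space and heat load savings:

```python
# Consolidation figures from the pulp and paper example in the text:
# ~30 servers plus ~60 client workstations, now hosted on 12 + 10
# virtual machine hosts respectively.

physical = 30 + 60      # original one-PC/one-OS machines
virtualized = 12 + 10   # machines after virtualization

reduction = 1 - virtualized / physical
print(f"{physical} -> {virtualized} PCs, a {reduction:.0%} reduction")
# → 90 -> 22 PCs, a 76% reduction
```

Roughly three out of every four boxes disappear, along with their power draw and cooling load.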
Besides these benefits, according to one of its engineers, the pulp and paper company has "broken the hardware-software link that often drives us into much more expensive repairs and upgrades than should be required. This happens when something like a hard drive fails in a relatively old box we're running as part of a process control system. Then, when we get the new hard drive, we find there are no drivers (or other incompatibilities) available for it on the old box. Next step is looking at a new server that—guess what?—won't run the old application software. And then, you've reached the point where a failed hard drive results in a major hardware and software upgrade. Admittedly, this wouldn't happen if everything in all our systems was up to date—but everything's not up to date."
This example points out a major benefit of virtualization and private clouds, and that's application and hardware platform longevity. With a virtualized private cloud, a hardware failure on a PC simply requires a transfer of the applications running on that PC to another PC in the cloud.
Depending on the cloud configuration, these types of transfers can occur either manually or automatically, and in either case, very quickly, as opposed to what's needed in a traditional installation. For this company, transfers occur automatically, as it uses VMware technology to run mirrored machines with automatic switchovers.
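The automatic-switchover idea behind mirrored machines can be sketched as a heartbeat monitor that promotes the standby when the primary goes quiet. This is only a toy model under assumed names and timings; VMware's actual fault-tolerance mechanism is far more involved (lockstep execution, shared storage and so on).

```python
# Minimal sketch of automatic switchover between mirrored machines:
# a monitor promotes the standby when the primary's heartbeat goes
# stale. Names and timings are illustrative, not VMware specifics.
import time

HEARTBEAT_TIMEOUT = 5.0  # seconds without a heartbeat before failover

class MirroredPair:
    def __init__(self):
        self.active = "primary"
        self.last_heartbeat = time.monotonic()

    def heartbeat(self):
        """Called periodically by the active machine while healthy."""
        self.last_heartbeat = time.monotonic()

    def check(self, now=None):
        """Fail over to the standby if the heartbeat is stale."""
        now = time.monotonic() if now is None else now
        if self.active == "primary" and now - self.last_heartbeat > HEARTBEAT_TIMEOUT:
            self.active = "standby"  # applications resume on the mirror
        return self.active
```

The point of the sketch is the contrast with a traditional installation: recovery is a state change in software, not a trip to the plant floor with a spare server.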
With almost every cloud implementation, including those at the pulp and paper company, the availability of a spare virtual application server allows testing of software patches and other upgrades with greatly reduced risk to operations, another reason to move to the cloud.
Many PC-based installations require periodic upgrades for various reasons, such as the upcoming discontinuation of support for the Windows XP operating system. This can be an ideal time to switch from a traditional one PC/one operating system installation to a virtualized private cloud.
"We recently replaced a set of eight stand-alone rack-mount servers that were five years old and due for replacement with two rack-mount host servers designed to run the equivalent of the existing system as virtual machines," reports an automation engineer user at a large water/wastewater utility in Southern California.
The utility uses VMware's ESXi 5.0 software, and Wonderware's Archestra (www.wonderware.com) is the HMI/SCADA system. The following standalone machines were converted to virtual machines in a private cloud and now run fully redundant, so that either of the two host machines can fail with no loss of SCADA functionality: Archestra System Platform object servers, Wonderware historian, domain controller, terminal server and I/O servers (communications to PLCs). Besides greater reliability, this private cloud installation gives users the ability to add virtual machines to the host servers to expand SCADA system capacity.
The downside of private clouds is that they require significant internal expertise to set up and maintain. Despite this, strong growth is expected to continue.
How Far Can the Cloud Go?
Private clouds are already in widespread use in the process industries, and remote access and asset management SaaS solutions should continue their rapid growth. Users trust private clouds for critical real-time control applications because they're an onsite solution contained within a particular facility. At least some users are comfortable with cloud-based remote access and asset management, probably because these services don't directly affect real-time control. This pattern seems likely to hold: process industry firms will limit real-time control to private clouds, and will trust SaaS providers only for functions outside the real-time control loop.
"At this point, our company doesn't use clouds in our server level and other process automation applications, primarily because we view it as too risky due to a lack of cloud standards and insufficient knowledge," cautions Rick Hakimioun, a senior instrument/electrical and control systems engineer with Paramount Petroleum.
"Process automation professionals must go through a paradigm shift to start taking advantage of cloud innovations, and I see it happening in a few years. Of course, we must first fully understand how it works, and we will need to have the control systems suppliers' blessing of the cloud before jumping on the bandwagon," observes Hakimioun.
"Security and a lack of standards are the biggest concerns. IEEE, ANSI, ISA and other non-profit organizations will need to be the forerunners on putting together the security and interoperability standards for process industry cloud-based applications, not vendors like Google and Microsoft," he adds.
Automation vendors will have to be fully on board also, as Hakimioun and other end users tend to trust their reputable automation suppliers over others when it comes to maintaining a secure, cloud-based system.
A system integrator seconds Hakimioun's opinions. "Private clouds are popular, but public cloud services such as Amazon's EC2 are less likely to be used for two reasons. Control networks are usually segregated, and do not have access to the Internet, and manufacturing customers like to have their servers inside the plant, minimizing points of failure," observes Chuck Toth, MSEE, a consultant with Maverick Technologies.
Other cloud caveats include dependence on the continued existence of the cloud services provider, and dependence on reliable and high-speed Internet connections. If the cloud services provider were to go out of business, then a process company's entire cloud-based remote access or asset management system would fail completely and instantly. Although the customers of cloud-based storage provider Nirvanix didn't lose all their data instantly, they were only given days to scramble to find a new supplier when the firm recently failed.
Perhaps failures like these, and the wariness they create, are why many end users want to see their main automation system vendors become the providers of cloud-based services.
And no matter who provides the cloud service, all public cloud operations, unlike private clouds, depend on the reliable operation of high-speed Internet and other external communication systems.
The cloud is here to stay in process automation, and its use will spread in the form of virtualized private clouds, and for remote access and asset management applications. Further penetration of public cloud-based applications will require increased participation of automation system suppliers and standards organizations.