The Straight Scoop on OPC and Security

Oct. 5, 2010
What You Need to Know about OPC-UA and OPC-Xi

By Roy Kok

So you're considering OPC for an application, and with today's concerns over security, you want to make sure your choice is a good one. Is OPC capable? In order to answer that question, we need to ask and answer a few others first.

What is OPC? If you answer Orthodox Presbyterian Church, you should probably be reading another article, but if you answered OLE for Process Control (OPC), you're getting warmer. Today OPC simply stands for "Open Connectivity—Through Open Standards." The earlier acronym gives a major clue to OPC's beginnings.

The problem industry needed to solve in 1996 had to do with sharing data between disparate automation solutions. OPC was founded by a group of automation vendors and leveraged the best available technology at the time. It essentially defined a set of standard interfaces to be adopted within automation products for the exchange of three types of data: real-time data access through a specification called OPC-DA; alarm and event messages through a specification called OPC-A&E; and historical data through a specification called OPC-HDA. There are many more specifications available through the OPC Foundation (www.opcfoundation.org), but these are the most widely implemented.

A major goal in the development of OPC was to leverage the best available technology. Why reinvent the wheel? The technology to leverage back in 1996 was the Component Object Model (COM) and the Distributed Component Object Model (DCOM). This is Microsoft technology, and it remains at the heart of its operating systems even today. This technology makes up the plumbing for the transfer of data between applications: within one computer, connectivity between applications is handled by COM; over networks, DCOM takes over. OPC simply defines the messaging (data naming and access conventions). By leveraging existing Microsoft technology, OPC ensured the reliable and efficient transport of data.

OPC does not define security. It leverages Microsoft security built into the operating systems. Again, why reinvent the wheel? In small network environments, that means workgroup level security. In larger environments, domain security takes over.

This presents us with a combination of both issues and opportunities. Plant engineers are experts at selecting software products, learning them and applying them. Hence, if the features are in a product, so is the solution, and they will be creative in crafting it. However, OPC leverages Microsoft's security, which is the same security that is managed throughout the enterprise by another group, namely IT. This means the effective use of OPC in a distributed application will require collaboration between plant-floor engineering and corporate IT. There is no way around it.

Bad Configuration

The OPC Training Institute (www.OPCTI.com), a resource for OPC training around the world, estimates that a majority of distributed OPC applications are actually incorrectly configured. Randy Kondor, president, says, "Security is the first requirement of any distributed application. Customers always make sure they acquire products with the capability of being secure. Then during the implementation phase of a project, security is lost. Systems are hastily architected. As systems come together and begin to operate, attention shifts to the quality of data and the breadth of system implementation, and the initial plumbing that makes up the data transfer is forgotten and often left open and vulnerable. This tends to give OPC a bad name, one it does not deserve. The technology it is based on is excellent and can offer all the security an enterprise needs—but only if it is properly configured.

"There are also several myths about OPC that should be clarified," he continued. "First that OPC is not secure due to port uncertainties inherent in DCOM. While it's true that without configuration, DCOM will default to the use of a range of ports for communication, DCOM can also be tightly locked to the use of a minimum of two ports explicitly defined. This is important for any IT professional wanting to lock down his or her enterprise with a firewall. Additionally, I hear about OPC reliability as an issue in distributed applications. While it is also true that for performance reasons, OPC offers a subscription methodology, and a remote client is sent updates on change. A lack of updates could be stable data or a lost connection. However, most servers also offer variables such as counters or timers that are easily set up as a watchdog—ensuring that your system status is known at all times. OPC is at the heart of many automation products on the market today and isn't just technology used to bridge between solutions."

Tunneling

One approach to improving distributed applications is to use "tunneling" products, which effectively replace DCOM in these applications. Tunneling products are OPC clients and servers that have their own proprietary technology for network communications. They can bridge the gap between machines, even across the Internet, operating through the network address translation features in switches, routers or firewalls. Note that these products have their own security settings and may introduce additional security challenges. Again, proper training is the key to success.

So far, we've been talking about the OPC technology developed back in 1996. Today that is known as OPC Classic, and while it has progressed in capabilities over the years, the underlying technology on which it is based (COM/DCOM) is being replaced. Also, there are new requirements for interoperability in automation. This is leading to a new set of OPC specifications.

In 2004, the OPC Foundation and vendors in the OPC Technical Advisory Council (TAC) set out to upgrade the OPC specifications for the 21st century. The new specifications included all the past data-interoperability requirements, along with requirements for unifying the specifications to enable even greater levels of interoperability: support for all data types, the ability to carry data models, and enhanced platform portability. This enabled OPC to run on servers and operate robustly in distributed applications, both on intranets and across the Internet, with built-in security. Security was to be manageable by the engineer and on by default, not a bolt-on. The new specification is known as OPC Unified Architecture (OPC-UA).

OPC-UA

Unlike OPC Classic, where security is a benefit of the underlying technology, OPC-UA specifically defines the security to be implemented by vendors. It is a core component of the specifications. All products implementing OPC-UA must implement OPC-UA security, and although "none" is an option, it is now a conscious decision that can be reversed as easily as it was chosen. OPC-UA leverages today's standards in security, including message encryption and identity certificates. Security is enforced at the application level: clients and servers must exchange certificates, based on the X.509 standard, in order to interact with each other.
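
As an illustration of what certificate-based UA security looks like from the client side, here is a minimal sketch using the open-source python-opcua library (one of several UA stacks that have since become available; the endpoint URL, node identifier, and certificate and key file names are assumptions for the example).

    # Minimal OPC-UA client sketch with certificate-based security,
    # using the open-source python-opcua library. Endpoint, node id and
    # certificate/key file names are placeholder assumptions.
    from opcua import Client

    client = Client("opc.tcp://192.168.1.10:4840")  # assumed endpoint URL

    # Security policy, message mode, and the client's own X.509
    # certificate and private key. The server must trust this certificate
    # (and the client the server's) before a secure session can open.
    client.set_security_string(
        "Basic256Sha256,SignAndEncrypt,client_cert.der,client_key.pem")

    client.connect()
    try:
        node = client.get_node("ns=2;s=Demo.Temperature")  # assumed node id
        print("Current value:", node.get_value())
    finally:
        client.disconnect()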

OPC-UA clearly changes the way security will be implemented in future UA-based automation systems. It makes them more secure by empowering the process engineer to design and implement the flow of data with application-to-application security, extending even to communications with embedded systems and devices. Higher-level applications, for example HMI/SCADA, will implement additional user-based security, typically based on Microsoft standards. OPC-UA-based products began reaching the market in 2008 and are available from a variety of vendors at all levels of the application spectrum.

OPC-Xi

Starting in 2008, several vendors recognized the need for a higher-level interoperability standard leveraging the latest .NET development tools and Microsoft standards such as Windows Communication Foundation (WCF). This technology is intended to be used between higher-level applications, such as HMI/SCADA, communication drivers, historians and higher-level business systems. It was adopted by the OPC Foundation early in 2010 and is now known as OPC Express Interface (OPC-Xi). Unlike OPC-UA, which delivers sample code along with its specifications to provide the functionality a developer requires from server-level down to sensor-level products, OPC-Xi is primarily a specification that leverages the new technologies already available on higher-level platforms.

The benefit of OPC-Xi is that it unifies the OPC Classic specifications and leverages current Microsoft technologies as the data transport, both within a PC and across any distributed architecture. Security is implemented as in any other WCF application, leveraging existing IT personnel knowledge. OPC-Xi products are being showcased today.

Both OPC-UA and OPC-Xi are designed with the latest security standards, encryption and authentication in mind. Both OPC-UA and OPC-Xi enable communications across both intranet- and Internet-based environments, thereby essentially rendering separate tunneling products unnecessary.

OPC's Future

What's the future of OPC? Well, a brief review of its history may show where it's going. Thomas Burke, president of the OPC Foundation, explains, "After being introduced back in 1996, OPC has become the interoperability standard at the level above the fieldbus and vendor protocol levels. The OPC Foundation now boasts over 400 members, distributed worldwide. There are thousands of OPC-compatible products on the market and OPC implementations numbering in the millions of nodes."

He adds, "The adoption of new OPC Technologies will likely be different from the days of OPC Classic. In the past, interoperability within the walls of the plant was the Holy Grail. It was OPC that delivered a common way to connect systems from disparate vendors, even systems from competitors. Today, this interoperability is largely taken for granted, and we are now faced with new challenges, primarily centered on system security."

The security we need in future automation systems will differ from what we need today, as we are faced with the need to exchange data at a public level. Automation systems will bridge to smart-grid control systems. They will make better use of real-time information from public sources, such as weather, dynamic raw material costs and new variable energy costs. In addition, enhanced and reliable communications over the Internet will enable new levels of outsourced services, allowing system integrators and other service providers to securely access the information they need for performance optimization, compliance reporting, equipment maintenance, etc.

However, while systems are being enhanced through OPC improvements and the ability to leverage new technologies, it is still incumbent on the design engineer to learn the technologies, involve the right IT personnel and implement the tools to their fullest capability. The security of a system is only as good as the engineers who implement it.

Training is also essential to understand the new technologies. More important, training will show you what you don't know. Any engineer can tinker till something works, but today, with widely distributed systems, it is the attention to detail that will make all the difference. 

Roy Kok is a consultant with www.AutomationSMX.com.
