The Straight Scoop on OPC and Security

What You Need to Know about OPC-UA and OPC-Xi


By Roy Kok

So you're considering OPC for an application, and with today's concerns over security, you want to make sure your choice is a good one. Is OPC up to the job? To answer that question, we need to ask and answer a few others first.

What is OPC? If you answer Orthodox Presbyterian Church, you should probably be reading another article, but if you answered OLE for Process Control (OPC), you're getting warmer. Today OPC simply stands for "Open Connectivity—Through Open Standards." The earlier acronym gives a major clue to OPC's beginnings.

The problem industry needed to solve in 1996 was how to share data between disparate automation solutions. OPC was founded by a group of automation vendors and leveraged the best technology available at the time. It essentially defined a set of standard interfaces to be adopted within automation products for the exchange of three types of data: real-time data access through a specification called OPC-DA; alarm and event messages through a specification called OPC-A&E; and historical data through a specification called OPC-HDA. There are many more specifications available through the OPC Foundation (www.opcfoundation.org), but these are the most widely implemented.

A major goal in the development of OPC was to leverage the best available technology. Why reinvent the wheel? In 1996, that meant the Component Object Model (COM) and the Distributed Component Object Model (DCOM). This Microsoft technology remains at the heart of its operating systems even today, and it makes up the plumbing for the transfer of data between applications. Within a single computer, connectivity between applications is handled by COM; over networks, DCOM takes over. OPC simply defines the messaging (data naming and access conventions), leaving the reliable and efficient transport of data to the existing Microsoft plumbing.
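To make that division of labor concrete, here is a minimal sketch of an OPC Classic (DA) read using the open-source OpenOPC library for Python. The server and tag names are illustrative assumptions (the Matrikon simulation server is a common test target); the library issues the COM/DCOM calls, while OPC defines what is being asked for.

    # Minimal OPC-DA read sketch (open-source OpenOPC library; Windows
    # only, requires pywin32). Server and tag names are assumptions
    # chosen for illustration, not part of the OPC specifications.
    import OpenOPC

    opc = OpenOPC.client()                    # the COM plumbing lives here
    opc.connect('Matrikon.OPC.Simulation')    # a common test server

    # OPC-DA defines the conventions: ask for a named item, get back
    # a (value, quality, timestamp) triple.
    value, quality, timestamp = opc.read('Random.Int4')
    print(value, quality, timestamp)

    opc.close()

Run against a remote machine, the same two calls ride on DCOM instead of COM, which is exactly where the security questions below come in.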

OPC does not define security. It leverages Microsoft security built into the operating systems. Again, why reinvent the wheel? In small network environments, that means workgroup level security. In larger environments, domain security takes over.

This presents us with a combination of both issues and opportunities. Plant engineers are experts at selecting software products, learning them and applying them. Hence, if the features are in a product, so is the solution, and they will be creative in crafting it. However, OPC leverages Microsoft's security, which is the same security that is managed throughout the enterprise by another group, namely IT. This means the effective use of OPC in a distributed application requires collaboration between plant-floor engineering and corporate IT. There is no way around it.

Bad Configuration

The OPC Training Institute (www.OPCTI.com), a resource for OPC training around the world, estimates that a majority of distributed OPC applications are incorrectly configured. Its president, Randy Kondor, says, "Security is the first requirement of any distributed application. Customers always make sure they acquire products with the capability of being secure. Then, during the implementation phase of a project, security is lost. Systems are hastily architected. As systems come together and begin to operate, attention shifts to the quality of data and the breadth of the system implementation, and the initial plumbing that makes up the data transfer is forgotten and often left open and vulnerable. This tends to give OPC a bad name, one it does not deserve. The technology it is based on is excellent and can offer all the security an enterprise needs, but only if it is properly configured.

"There are also several myths about OPC that should be clarified," he continued. "First that OPC is not secure due to port uncertainties inherent in DCOM. While it's true that without configuration, DCOM will default to the use of a range of ports for communication, DCOM can also be tightly locked to the use of a minimum of two ports explicitly defined. This is important for any IT professional wanting to lock down his or her enterprise with a firewall. Additionally, I hear about OPC reliability as an issue in distributed applications. While it is also true that for performance reasons, OPC offers a subscription methodology, and a remote client is sent updates on change. A lack of updates could be stable data or a lost connection. However, most servers also offer variables such as counters or timers that are easily set up as a watchdog—ensuring that your system status is known at all times. OPC is at the heart of many automation products on the market today and isn't just technology used to bridge between solutions."

Tunneling

One way to improve distributed applications is to use "tunneling" products, which effectively replace DCOM. Tunneling products are OPC clients and servers that use their own proprietary technology for network communications. They can bridge the gap between machines, even across the Internet, operating through network address translation (NAT) features in switches, routers or firewalls. Note that these products have their own security settings and may introduce additional security challenges. Again, proper training is the key to success.
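To show why tunneling is firewall-friendly, here is a toy sketch of the idea (not any vendor's product): one agent sits next to the OPC server, its peer sits next to the client, and everything crosses the network as a simple stream on one fixed TCP port. The port number and tag payloads are assumptions for illustration.

    # Toy tunneling sketch: stream readings over a single fixed TCP
    # port instead of DCOM's port range. Port and payloads are
    # illustrative assumptions.
    import itertools
    import json
    import socket
    import time

    TUNNEL_PORT = 5500   # the one port the firewall must allow

    def fake_opc_readings():
        # Stand-in for reads from a local OPC server (e.g., via OpenOPC).
        for i in itertools.count():
            time.sleep(1)
            yield {"tag": "Random.Int4", "value": i, "quality": "Good"}

    def serve_readings(readings):
        # Plant-side agent: accept one peer and stream readings to it.
        with socket.create_server(("", TUNNEL_PORT)) as srv:
            conn, _ = srv.accept()
            with conn:
                for reading in readings:
                    conn.sendall((json.dumps(reading) + "\n").encode())

    if __name__ == "__main__":
        serve_readings(fake_opc_readings())

The matching agent on the far side would connect to this port, read the newline-delimited records and republish them locally as OPC items; real tunneling products add authentication and encryption on top of this single channel.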

So far, we've been talking about the OPC technology developed back in 1996. Today that is known as OPC Classic, and while it has progressed in capabilities over the years, the underlying technology on which it is based (COM/DCOM) is being replaced. Also, there are new requirements for interoperability in automation. This is leading to a new set of OPC specifications.

In 2004, the OPC Foundation and vendors in the OPC Technical Advisory Council (TAC) set out to upgrade the OPC specifications for the 21st century. The new specifications included all the past data interoperability requirements, as well as requirements for unifying the specifications to enable even greater levels of interoperability: support for all data types, for data models and for enhanced platform portability. This lets OPC run on servers and operate robustly in distributed applications, both on intranets and across the Internet, with built-in security. Security was to be manageable by the engineer and on by default, not a bolt-on. This new specification is known as OPC-Unified Architecture (OPC-UA).
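What "security by default" looks like in practice: a minimal OPC-UA client sketch using the open-source python-opcua package. The endpoint URL, certificate files and node id are assumptions for illustration; the point is that signing and encryption are negotiated by the protocol itself, with no DCOM or Windows account configuration involved.

    # Minimal OPC-UA client sketch (open-source python-opcua package).
    # Endpoint, certificate files and node id are illustrative
    # assumptions.
    from opcua import Client

    client = Client("opc.tcp://localhost:4840/freeopcua/server/")
    # Security is part of the protocol: sign and encrypt the session
    # with application certificates.
    client.set_security_string(
        "Basic256Sha256,SignAndEncrypt,client_cert.der,client_key.pem")

    client.connect()
    try:
        node = client.get_node("ns=2;i=2")   # an example node id
        print("value:", node.get_value())
    finally:
        client.disconnect()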
