Process plants gain big advantages in their operations by getting information directly from OT into IT and from IT back into OT.

Making OT and IT connections click

Aug. 29, 2023
Connecting OT and IT networks can be beneficial—if done correctly

Connecting operational technology (OT) with information technology (IT) offers distinct advantages for process plants. While technology has made the task easier, there are still challenges to consider. That’s why following several best practices can save plant operators a lot of headaches down the road.

Andrew Thomas, founder and inventor of the Cogent DataHub technology powering Skkynet, talked with Control’s editor in chief, Len Vermillion, about best practices for connecting OT and IT. Now CEO of the company, Thomas discussed system integration, software for real-time data communications and the interplay of OT and IT in modern plants.

Q: Let’s start by talking about the advantages of connecting OT to IT. What can you say about them?

A: The big one is timely access to information. If you isolate your IT and your OT networks, you end up with communications between them being human-mediated. This means slow communications and incomplete information that’s prone to errors. There’s a real advantage to getting that information directly from OT into IT, and from IT back into OT. Obviously, for production planning, resource management, safety monitoring, analytics, and fault prediction, not everybody needs real-time data, but there’s a big advantage to not having to wait until the end of a shift or the end of the day to gain production insights.

One of the rules of communicating between IT and OT is that the OT network should never have a direct Internet connection. This means you must route data through IT to get to some of the up-and-coming services: cloud-based artificial intelligence (AI) systems, centralized monitoring across multiple, geographically diverse locations, and centralized reporting that aggregates data from different sources. All of these need a mechanism to get data out to those processing locations, but you don't want that to be a direct connection out of OT.

So, this issue stands in the way when companies we deal with need to share data with third parties. They've got suppliers or customers who want access to the real-time information coming off their processes, either for just-in-time supply or for insight into what the process is doing. Customers want to know what to expect. But they need a transmission path that isn't exposed to the risks associated with the Internet.

Q: We'd love to see everything go smoothly all the time, but there must be challenges involved. What are some of the challenges of connecting OT and IT?

A: I see three big ones, which I'd classify as availability, security and reliability. Availability is effectively the ability to provide data when it’s required in a form its consumers can use. This means designing networks that can deliver the data, and choosing protocols that deliver it in a usable form.

These aren’t independent choices. The protocol you choose will affect your network topology. Or, if you have a certain network topology in mind, that's going to limit the protocols you can choose. If you want a cloud service, it may expect a direct connection from OT into the cloud, which is another no-no. That will play into security.

I’d say that security and availability are inversely correlated. The more available you want to make your data, the less secure it's going to be, and vice versa. So, you must adjust your availability requirements according to your security needs.

When we look at industrial protocols, they're typically not designed with the goal of sharing data across networks. They tend to be client/server designs, which are inappropriate when the server is in the protected network. You don't want to reach into the OT network to collect data from a server. You're probably going to end up with a mix of protocols depending on the leg of the journey your data is taking. So, all the way from the PLC or DCS right up to a cloud or an analysis system, you may need different protocols for each segment of the path the data traverses.
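As an editorial illustration of that per-segment handoff (not something from the interview), here is a minimal Python sketch of the OT-edge step. The read_plc_tags() function is a hypothetical placeholder for whatever driver actually talks to the PLC or DCS (OPC UA, Modbus, a vendor API), and the tag names and values are invented; the point is only that raw reads get normalized into self-describing, timestamped records that a later segment can carry over a different protocol.

    import json
    import time

    def read_plc_tags():
        # Hypothetical placeholder for the OT-side driver (OPC UA, Modbus, vendor API).
        # A real system would poll the PLC or DCS here; these values are invented.
        return {"TANK1_LEVEL": 73.2, "TANK1_TEMP": 41.8, "PUMP1_RUN": 1}

    def normalize(tags):
        # Convert raw tag reads into timestamped records so the next leg of the
        # journey does not need to know anything about the fieldbus protocol.
        now = time.time()
        return [{"name": name, "value": value, "ts": now} for name, value in tags.items()]

    if __name__ == "__main__":
        records = normalize(read_plc_tags())
        # Hand the records to the outbound leg (see the store-and-forward and
        # DMZ sketches below); here we just print the JSON form.
        print(json.dumps(records, indent=2))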

Then there's a reliability issue—preserving data during disconnections. If you lose a network connection or you have a hardware failure, you want to preserve as much data as you can during that time. That typically means you want to have store-and-forward capability for data, so you can deal with a network loss. Plus, you want redundancy, so you should have multiple data paths. If one path goes down, the other is available.
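To make the store-and-forward idea concrete, here is a minimal Python sketch, again an editorial illustration rather than Skkynet's implementation: records are written to a local SQLite file as they're produced, and a flush step hands them to whatever outbound sender is available, deleting them only after a successful send. The send_batch callback is a hypothetical stand-in for the real transport.

    import json
    import sqlite3

    class StoreAndForward:
        # Disk-backed queue so data survives network outages and process restarts.
        def __init__(self, path="buffer.db"):
            self.db = sqlite3.connect(path)
            self.db.execute(
                "CREATE TABLE IF NOT EXISTS queue (id INTEGER PRIMARY KEY, payload TEXT)")
            self.db.commit()

        def store(self, record):
            # Always write locally first; the network may be down right now.
            self.db.execute("INSERT INTO queue (payload) VALUES (?)",
                            (json.dumps(record),))
            self.db.commit()

        def flush(self, send_batch, batch_size=100):
            # send_batch is a caller-supplied function that raises if the link is down.
            while True:
                rows = self.db.execute(
                    "SELECT id, payload FROM queue ORDER BY id LIMIT ?",
                    (batch_size,)).fetchall()
                if not rows:
                    return
                send_batch([json.loads(payload) for _, payload in rows])
                # Only delete once the batch has been delivered successfully.
                self.db.execute("DELETE FROM queue WHERE id <= ?", (rows[-1][0],))
                self.db.commit()

In this sketch, the OT-side process calls store() on every poll cycle and calls flush() in a separate loop; when the send raises, the backlog simply stays on disk until the connection comes back.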

Once again, your choice of protocol determines how much redundancy or what sort of store-and-forward you can offer. That, again, limits protocol choices. If you rely on a particular vendor's redundancy solution, for example, it may limit your security capabilities or your network topology.

Q: Let's talk about how to set this up. What are some of the best practices?

A: IT and OT, in my opinion, should treat one another as hostile. I don't mean there's hostility between the two teams. People can get along, but the networks shouldn't. There shouldn’t be an opportunity for one network to compromise another due to an entry point between them. They should be entirely distinct from one another. No program should be able to connect from one network to the other.

To achieve this, you'll have to set up a mechanism that stands between the two networks. A network with a demilitarized zone (DMZ) is a common approach. Some people say, ‘we only open one port.’ I've heard this many times, and it really misses the point. It's not firewall ports that are attacked during an attempted compromise of a network; it's the software that runs on that port. It doesn't matter who you are or what your development team is like; there is always an exploit. The only way to protect yourself from an exploit through an open firewall port is not to have an open firewall port.
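As one minimal sketch of that outbound-only pattern, assuming a hypothetical relay host sitting in the DMZ: the OT-side process originates a TLS connection out through the firewall and writes its data, so no inbound port is ever opened into the OT network, and the IT side connects out to the same relay from its own side. The host name, port and certificate path below are placeholders, not real endpoints.

    import json
    import socket
    import ssl

    DMZ_RELAY = "relay.dmz.example.com"    # hypothetical relay in the DMZ
    RELAY_PORT = 8883                      # placeholder port
    CA_CERT = "/etc/pki/dmz-relay-ca.pem"  # CA that signed the relay's certificate

    def push_records(records):
        # The OT side dials out; the firewall only needs to permit this outbound
        # connection. Nothing ever connects inward to the OT network.
        context = ssl.create_default_context(cafile=CA_CERT)
        with socket.create_connection((DMZ_RELAY, RELAY_PORT)) as raw_sock:
            with context.wrap_socket(raw_sock, server_hostname=DMZ_RELAY) as tls:
                tls.sendall((json.dumps(records) + "\n").encode("utf-8"))

    if __name__ == "__main__":
        push_records([{"name": "TANK1_LEVEL", "value": 73.2, "ts": 1693305600.0}])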

Q: Are there any other ways to keep your system secure?

A: I’ve touched on communication security, but the biggest thing you can do is educate your humans. Technical defenses will take you a long way against automated attacks, but what quite often brings a system down is a social engineering attack or an accidental misconfiguration of network hardware. Those are things you can’t mitigate purely with technical means. They must be addressed through education: making sure the humans in the loop understand the risks and are able to identify and defend against them.
