BESS builder virtualizes to streamline automation

System integrator George T. Hall helps a battery and energy storage solution (BESS) builder deploy a unified namespace (UNS) architecture
Feb. 20, 2026
6 min read

Key Highlights

  • System virtualization breaks down traditional hardware and software barriers, allowing more flexible and efficient industrial communication.
  • A case study of a battery plant demonstrates how virtualization and edge computing can unify data, improve decision-making and generate substantial cost savings.
  • Unified namespace (UNS) and publish-subscribe architectures streamline data sharing and reduce discrepancies across organizational layers.

Logically, the primary advantage and attraction of system virtualization is that it lets users escape the confines of PLCs and other traditionally rigid hardware and software, communicate more freely among layers that couldn’t interact directly before, and gain new capabilities and efficiencies, including beginning to integrate some AI tools.

“We’ve been dealing with system virtualization during the past decade. We’ve moved from the platform side of PLC hardware with SCADA software and hardware, such as Wonderware, Pro-face and Ignition, and shifted to support and encourage virtualization of SCADA functions in software-based Docker containers,” says Dylan Lane, digital manufacturing systems manager at George T. Hall (GTH), a system integrator in Anaheim, Calif., and a certified member of the Control Systems Integrators Association. “Process control and automation lives in a niche, but we’re steadily moving toward virtualization and supporting low-code, no-code applications, which let non-controls people shift PLC functions and other control tasks to mainstream, IT-based, object-oriented programs.”

Lane reports that system virtualization can be understood as an attempt to modernize industrial automation, which often lags behind other technical disciplines because its practitioners don’t want to be on the bleeding edge of change, given their core mission of maintaining safety. However, even though it’s taking a while, Lane observes that process industry organizations are catching up, adding more applications to cloud-computing services enhanced by artificial intelligence (AI), and drawn by the promise of large efficiencies and savings.

“Usually, upgrades to hardware and traditional software are costly and time-consuming, and process automation has had to deal with these and other limits,” explains Lane. “So why would you bring a 30-year-old machine up to the point of integrating with the cloud? Well, easier upgrades are another potential advantage of system virtualization. Plus, users can do on-premises virtualization of existing hardware. If an initial project goes well, then users can scale similar improvements more easily, change software platforms without changing hardware, and even add new CPUs, RAM and power more easily. And, they can upgrade without adding as many new and costly products and accessories. For example, middleware translates between other software packages, and runs data-pickup APIs for users. It’s a lot easier to add middleware layers if they’re on a virtual system because users no longer have to integrate them with physical nodes and other hardware.”
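For readers who want a concrete picture of the kind of middleware translation Lane describes, here is a minimal sketch in Python. All tag names, standardized names and units below are hypothetical and only illustrate the pattern of mapping vendor-specific PLC tags into a common vocabulary before handing data to higher layers; they are not the plant’s actual configuration.

```python
# Minimal middleware sketch: translate vendor-specific PLC tag names into
# standardized names and units before publishing to higher layers.
# All names here are hypothetical examples of the pattern.

TAG_MAP = {
    "PLC7.DB12.TEMP_RAW": ("weld_cell_temperature", "degC"),
    "PLC7.DB12.CT_COUNT": ("parts_completed", "count"),
}

def translate(raw_readings: dict) -> dict:
    """Return readings keyed by standardized tag names with units attached."""
    standardized = {}
    for raw_name, value in raw_readings.items():
        if raw_name in TAG_MAP:
            std_name, units = TAG_MAP[raw_name]
            standardized[std_name] = {"value": value, "units": units}
    return standardized

print(translate({"PLC7.DB12.TEMP_RAW": 42.7, "PLC7.DB12.CT_COUNT": 118}))
```

Running such a mapping in a virtual layer, rather than in each physical node, is what makes it easy to add or change without touching the hardware underneath.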

Building better batteries

For instance, Lane recently helped digitalize and virtualize a plant that produces battery and energy storage solutions (BESS) and wanted to modernize its production lines and increase throughput. Just like many decades-old factories, this facility had lots of disconnected machines and equipment, and other plant-floor and organizational islands and data silos, including wash-and-clean stations and assembly areas. It also had no central HMI/SCADA system, so shift leads manually passed information down to local HMIs, and wrote up production counts on paper templates for delivery to decision-makers on the business side.

“Staff had to plug into each station, and send information to a central database. They also used lots of paper and clipboards,” adds Lane. “It was a well-oiled operation, but there was also a question about how it could keep working without having just the right people in each position all the time.”

To help the BESS plant digitize, upgrade and integrate AI, Lane collaborated with the company to develop a modern data-collection and SCADA system with interconnected, web-based communications. Ironically, some new hardware was required to support this digitalization/virtualization project, including 155 edge servers to handle the terabytes of data generated by the facility’s approximately 15 million device tags, and replace the approximately 500 PLCs it needed to manage them.

“This plant also had some data acquisition (DAQ) devices for quality-assurance metrics, and tracking and tracing on its existing PLC network, but all of these machines and other parts were still islands with little coordination or scaling between them,” says Lane. “There were also many islands because this plant builds all types of batteries and BESSs from raw materials. It operates coil winders, jellyroll cathode-assembly equipment, electrolyte fillers, cappers and welders, and washing and cleaning equipment.”


Because these processes and their components generate so much data, and require so much networking, cooling and support, Lane reports the BESS company couldn’t simply install a typical server architecture. However, it could distribute its data processing with edge-style servers, which would also help meet its goal of virtualizing as many functions as possible. To help the battery company coordinate its Ethernet and TCP/IP networks with others using serial communications, these new servers run Ignition web-based SCADA software, and network via MQTT brokers or Kafka, which let them connect to different machines at each level on their own terms, with the goal of creating a central, unified platform. Kafka is a comparable event-streaming platform that handles real-time data with low latency and fault tolerance, and can connect to many data streams.
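To make the broker pattern concrete, here is a minimal sketch of how an edge node might publish one standardized reading to an MQTT broker using the paho-mqtt Python client. The broker hostname, topic path and payload shape are hypothetical stand-ins, not details of the plant’s actual Ignition/MQTT deployment.

```python
import json
import paho.mqtt.client as mqtt  # assumes paho-mqtt 2.x is installed

BROKER = "broker.example.local"                                # hypothetical broker hostname
TOPIC = "acme-energy/anaheim/assembly/welder-03/temperature"   # hypothetical UNS-style topic

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.connect(BROKER, 1883)
client.loop_start()

# Publish one standardized reading; any subscriber (SCADA, MES, ERP, analytics)
# receives the same payload in the same shape.
payload = json.dumps({"value": 42.7, "units": "degC", "quality": "GOOD"})
info = client.publish(TOPIC, payload, qos=1)
info.wait_for_publish()

client.loop_stop()
client.disconnect()
```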

UNS unchained

“The main technical philosophy here is having a unified namespace (UNS) in which participants use the same language, even as they collect data from different places,” explains Lane. “This virtualization lets them plug and play, and achieve a scalable publish-subscribe system that presents standardized data in a UNS supported by MQTT or Kafka. Overall, virtualization is the key to making all this possible because it creates a system that makes data as redundant and cloneable as needed. Users can already integrate information, reliably and redundantly, but virtualization in Docker containers makes it much easier.”

Lane reports that UNS is a conceptual architecture for standardizing communications around existing names and layers. It uses a semantic hierarchy to manage and communicate states of data, and essentially employs that semantic layer to let participants talk.
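One way to picture that semantic hierarchy is as a consistent path from the enterprise down to an individual signal. The sketch below is illustrative only, using hypothetical names and a common enterprise/site/area/line/cell layering rather than the battery plant’s actual naming scheme.

```python
# Illustrative only: build UNS-style topic paths from a fixed semantic
# hierarchy so every layer addresses the same data by the same names.
from dataclasses import dataclass

@dataclass(frozen=True)
class UnsNode:
    enterprise: str
    site: str
    area: str
    line: str
    cell: str

    def topic(self, metric: str) -> str:
        return "/".join([self.enterprise, self.site, self.area,
                         self.line, self.cell, metric])

welder = UnsNode("acme-energy", "anaheim", "assembly", "line-2", "welder-03")
print(welder.topic("temperature"))
# acme-energy/anaheim/assembly/line-2/welder-03/temperature
print(welder.topic("oee"))
# acme-energy/anaheim/assembly/line-2/welder-03/oee
```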

“The most important aspect of UNS is that it can skip the traditional network layers defined by the ISA-95 Purdue model,” explains Lane. “For example, if an enterprise resource planning (ERP) application is seeking data, then the applicable SCADA system can use UNS to skip the manufacturing execution system (MES) layer, and talk directly to the ERP, which is a lot more efficient.”
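To illustrate that shortcut, here is a minimal sketch of an ERP-side consumer reading SCADA-published values straight from the broker, again with paho-mqtt and hypothetical broker and topic names; a real ERP integration would of course be more involved.

```python
import paho.mqtt.client as mqtt  # assumes paho-mqtt 2.x

BROKER = "broker.example.local"               # hypothetical broker hostname
TOPIC = "acme-energy/anaheim/assembly/+/oee"  # hypothetical UNS topic; '+' is a wildcard

def on_connect(client, userdata, flags, reason_code, properties):
    # Subscribe once the broker acknowledges the connection.
    client.subscribe(TOPIC, qos=1)

def on_message(client, userdata, message):
    # The ERP-side consumer reads exactly what the SCADA layer published;
    # there is no MES hop and no re-contextualization of the value.
    print(message.topic, message.payload.decode())

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER, 1883)
client.loop_forever()
```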

Beyond these organizational hurdles, Lane explains that data would previously change each time a new location or user touched it, so each SCADA, MES or ERP layer could add its own discrepancies through its individual efforts to contextualize that data. Consequently, a request to measure equipment efficiency might produce three different numbers from three different layers. Thanks to its publish-subscribe methodology, UNS halts this individual contextualization, and provides the temperature for an ice cream process, or any other parameter, from one place.
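One simple mechanism behind that single source of truth, sketched below with hypothetical names, is publishing the latest value as a retained message, so every subscriber, whether SCADA, MES or ERP, reads the same last-known reading from the broker instead of deriving its own.

```python
import json
import paho.mqtt.client as mqtt  # assumes paho-mqtt 2.x

BROKER = "broker.example.local"                      # hypothetical broker hostname
TOPIC = "creamery/line1/freezer-barrel/temperature"  # hypothetical ice-cream process tag

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.connect(BROKER, 1883)
client.loop_start()

# retain=True keeps the latest value on the broker, so any layer that
# subscribes later immediately gets the same single reading rather than
# computing its own contextualized number.
info = client.publish(TOPIC, json.dumps({"value": -5.1, "units": "degC"}),
                      qos=1, retain=True)
info.wait_for_publish()

client.loop_stop()
client.disconnect()
```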

“In the case of our battery manufacturer, its digitalization and virtualization effort was successful because it centralized its SCADA system virtually on server hardware,” adds Lane. “This allowed the company’s entire production system to collect data, and apply AI-aided decision-making, including specific AI models trained to solve large, multivariable problems based on initial data, such as determining where and when to route raw materials. So far, the BESS company has implemented multiple digitalization/virtualization projects, and is saving $5-10 million per year.”

About the Author

Jim Montague

Executive Editor

Jim Montague is executive editor of Control. 
