AI in your pocket

Jan. 24, 2024
Balancing automation with human engagement in a smartphone-distracted world

I’ve often thought the boss relished a chance to be a curmudgeon. It seemed to be the case when, after decades surviving without them, the crew suggested how useful automated valves would be for isolating some distillation columns. “Why do we need that?” he growled. “So operators can spend more time staring at their phones?”

It’s an old-school challenge. Management has a duty to ensure the company’s money is spent with discretion, and in today’s world, frugally.

That our process plants would be inhabited by brains addicted to dopamine-inducing smartphones wasn’t foreseen 20 years ago, when graphics-standards gurus extolled the need for dull grayscale graphics that were largely featureless until an alarm came along. Today, smooth-running automation—an office or shop free of challenges—lures all to the endless distraction of social media and the rest of the Internet. For years, the silicon geniuses behind this content have employed artificial intelligence (AI) to keep their victims scrolling and searching. How could our discipline—as paid providers of often vital “content” such as measurements and alarms via a human-machine interface (HMI)—anticipate that a powerful, pocket-sized, Internet-connected computer would compete for the eyes we need on the process?

Ironically, the same technology turning our workers into hypnotized distraction-seekers, drawn to their personal screens in every idle moment, is also useful to us. Once upon a time, HMI was misgendered as “MMI” for man-machine interface. But the “machine” we want our operators to engage with isn’t the computer, it’s the process. Today’s graphics packages are being retooled using the same HTML technology that powers web content, so distributed control system (DCS) graphics can be scaled to smaller screens. The ability to view DCS screens on other platforms isn’t new, but try scrolling or pinch-to-zoom on such devices and you may be disappointed.

If you designed your graphics and faceplates for 16:9 LCD flatscreens of 24 inches and up, interacting with them shrunken to tablet or smartphone size can be frustrating. Try entering a new setpoint or changing the mode of a control loop on a 6-inch screen, especially if you have bratwurst-sized fingers.

The pocket computer is useful in other ways. Clever suppliers are happy to sell you a hazardous-area-capable smartphone mounted on an ANSI-rated hardhat, which can be used to view DCS graphics or enable two-way live video and audio interaction with remote technical help. Its usefulness is predicated on the assumption that a remote assistant with the expertise and confidence to provide meaningful guidance can be reached in one’s hour of need. This, you may have noticed, is not a given. Just consider your inadequately named IT “help desk.”

Can a unified operator interface mend the broken, shared reality induced by Internet-connected screens? In the days before blockhouses and blast-resistant control buildings, operators were immersed in the physical process, checking local gauges and manipulating local controllers and manual valves, while listening to pumps and machinery for potential issues. For years, we’ve reflected on the lost paradigm of the central control room “board,” where a house operator could scan a 30-yard wall of controllers for deviations, and an unbroken graphic sparsely populated with key alarms spanned the same wall above. Today, we have the potential to take most of it out into the plant with us, not only on tablets, smartphones and other screens, but also through augmented reality (AR), the interface blended with views of actual reality.

Imagine if you could provide the details and precision of modern sensors and instrumentation, projected in context with one’s surroundings. Novice operators, who might struggle to connect the dull grayscale control house trends and graphics to the physical process, could climb the learning curve faster, making connections, diagnosing issues and avoiding errors in real time.

Today, we face the mounting challenge of dragging both young and old minds from their echo chambers into the shared reality of the process. Their safety and the continued reliability, efficiency and sustainability of the plant depend on it.

About the Author

John Rezabek | Contributing Editor

John Rezabek is a contributing editor to Control.