Is Your HMI Lying to You?

Control Report: Sometimes HMIs Fail to Tell the Entire Truth About Process Control Conditions


Running a process plant is often described as long periods of boredom interrupted by moments of sheer panic. I recently read a book like that: Taming HAL: Designing Interfaces Beyond 2001 (by Asaf Degani, St. Martin's Press, ISBN 0-312-29574-X, $35, hardcover). The book has several long, boring passages explaining how autopilots work, interrupted by moments of horror when the author describes ships running aground and airplanes crashing. In every case, the mishap was caused by an operator interface that wasn't up to the job.


The author's premise is that the HAL 9000 supercomputer in the movie 2001: A Space Odyssey is simply the best-known example of an automation system that ran amok without warning its human operators via an HMI. HAL made mistakes, did not inform his operators that he had control problems, and eventually went berserk, endangering the mission. The book demonstrates how automation systems have been screwing up in similar fashion for years, especially in airplanes.


The author is a research scientist at NASA, and he specializes in flight-deck procedures. What he tells in this book about airplane automation will make you consider driving or taking the train instead of flying to the next ISA Show.


 "By the time a human operator responds to the alarm, an explosion or meltdown might be just seconds away."


Despite its flight-control orientation, this is a book every process control engineer should read. First, you'll learn about basic HMI problems, such as why those annoying VCR programming screens don't work, or why you can't set the alarm clock in your hotel room. The author explains how HMI design problems lead to user frustration, confusion, accidents and, sometimes, death.


You'll also pick up many terms and concepts they don't teach at your DCS vendor's HMI display configuration classes: non-deterministic behavior, automatic transitions, coupling, population stereotypes, mode engagement vs. mode activation, walk-in interfaces, states and regions, envelope protection, and "automation surprise."


You’ll also learn about the dreaded "automation lock." That’s when an automated system drives itself into an unsafe state where, no matter what it does, failure or disaster will occur. In an airplane, that often means it will crash.


In an airplane, for example, the autopilot may be correcting for a condition, such as wing icing, without informing the pilots that it is having a difficult time. When it can no longer keep the plane flying safely, it suddenly disengages, and the plane corkscrews toward the ground. The pilots, who may have been schmoozing with the flight attendants all this time, are suddenly handed a violently behaving airplane in a dangerous flight condition, and they have no idea why.


The parallel in process control is clear: Suppose a DCS in a process plant is controlling temperature by a combination of coolant flow, agitation, level and pressure. For one reason or another, it reaches the limits of all the controlling variables, but the temperature continues to increase. It has reached automation lock: no matter what it does now, it cannot bring down the temperature, so it sounds an alarm.
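Degani's point suggests a simple defensive check an HMI designer could build in: warn the operator before the hard alarm trips, as soon as every manipulated variable is saturated at its limit but the controlled variable is still heading the wrong way. Here is a minimal sketch in Python; the tag names, limits, and function are invented for illustration, not taken from any real DCS.

```python
# Hypothetical early-warning check for "automation lock" in a temperature
# loop. All names and values here are illustrative assumptions.

def automation_lock_warning(temp_trend, outputs, limits):
    """Return True when every manipulated variable is saturated at its
    limit yet the temperature trend is still positive (still heating)."""
    saturated = all(outputs[name] >= limits[name] for name in outputs)
    return saturated and temp_trend > 0

# Coolant flow, agitation, level, and pressure all at 100% of range,
# yet temperature is still rising: warn now, before the hard alarm.
outputs = {"coolant_flow": 1.0, "agitation": 1.0, "level": 1.0, "pressure": 1.0}
limits = {"coolant_flow": 1.0, "agitation": 1.0, "level": 1.0, "pressure": 1.0}
print(automation_lock_warning(temp_trend=0.4, outputs=outputs, limits=limits))  # True
```

The design choice is the whole point: the warning fires on the combination of conditions (all outputs saturated plus an adverse trend), not on the temperature alarm limit itself, so the operator hears about the problem while there is still time to act.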


By the time a human operator responds to the alarm, an explosion or meltdown might be just seconds away. Remember The China Syndrome?


In cases where the author says human error was involved, there was nothing wrong with the automated system. Even when an airplane crashed or a ship went aground, the automation was doing exactly what it was programmed to do right up to the moment of impact. Its HMI did not, however, warn anyone that it was not doing what its human operators expected it to do. A tiny indicator light that flashes for three seconds is not as informative as alarm bells and horns.


In cases where a disaster occurred because of a control problem, the operator interface didn't tell its human operators that a problem existed until it was too late to correct. The HMI should have said: "The airplane is barely under control and I am running at the limit. Please take a look."


When a modern automation system runs reliably day after day, users may come to rely on it too much. They develop an over-trust condition, in which they may stop monitoring the controls; worse, the author says, they may even dismiss clues that the automation is not working. An HMI designer cannot let such a situation occur.


If your HMI doesn’t tell the entire truth about process control conditions, disaster might be right around the corner. You may want to give your HMIs a second look to see if automation lock or automation surprise is about to happen in your plant. This book will give you many ideas on where to look for problems.


Rich Merritt, Senior Technical Editor

rmerritt@putman.net
