John Rezabek

Listen to weak signals

Jan. 24, 2019
The whispers we are wont to ignore may be harbingers of situations we want to avoid

We were headed to the airport for a three-day business trip and failed to notice the faint click-click-click, barely audible above the radio. The enclosed levels of the garage were full, so we ended up parking on the roof, open to the elements of the midwestern winter. The return flight from Houston arrived after 2200 hours, with another hour’s drive still ahead of us. Things were looking up when we found no freezing rain or snow had accumulated in our absence, but we were dismayed to discover the left front tire was flat.

Failing to notice “weak signals,” such as the characteristic sound of a nail in the front tire, can lead to greater calamities than a flat tire late on a winter’s night. That’s one key concept presented by Doug Rothenberg at a recent meeting of the Cleveland, Ohio ISA chapter. Abnormal Situation Management guru and author of what’s been called the most comprehensive treatise available, “Alarm Management for Process Control,” Rothenberg is soon to publish a new book further illuminating the discipline required for “Abnormal Situation Avoidance.”

Rothenberg points out that there are many weak signals in our daily endeavors that we aren’t inclined to process. We abhor uncertainty and ambiguity, so our brains work to rationalize or dismiss many weak signals. Humans are naturally biased against seeing that the emperor has no clothes, or accepting any indication that conflicts with convention or comfort. It takes effort and discipline to bring weak signals into consciousness. We are quick to say, “This is what it means,” when we should really ask, “What could it really mean?” he advises.

We experience many instances of partial information that we are quick to rationalize. We love to create accommodating explanations with incomplete information, Rothenberg points out. Two measurements that normally agree start to deviate. “Those instruments are unreliable—most likely it’s just drift.” Life goes on. The waste heat boiler/incinerator starts losing temperature and using more fuel. We don’t want to think this might mean something dire is happening upstream, so, “Oh, Hidalgo is on, he always struggles with that boiler control.” But Rothenberg says, stand up, notice, pay attention—the signals are talking.
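A minimal sketch of that first case, the normally agreeing pair that begins to deviate. The tag values, deviation limit and persistence count below are invented for illustration; the point is simply that a persistent disagreement gets surfaced instead of being written off as drift.

```python
# Illustrative sketch: surface a "weak signal" when two normally-agreeing
# measurements deviate persistently, instead of dismissing it as drift.
from collections import deque

class DeviationWatch:
    def __init__(self, limit=2.0, persistence=3):
        self.limit = limit              # deviation (engineering units) worth noticing
        self.persistence = persistence  # consecutive samples before we call it a signal
        self.recent = deque(maxlen=persistence)

    def update(self, meas_a, meas_b):
        """Return True once the pair has deviated beyond 'limit' for 'persistence' samples."""
        self.recent.append(abs(meas_a - meas_b))
        return (len(self.recent) == self.persistence
                and all(d > self.limit for d in self.recent))

# Two redundant temperature readings that quietly drift apart
watch = DeviationWatch(limit=2.0, persistence=3)
for a, b in [(150.1, 150.3), (150.2, 151.9), (150.0, 152.8), (149.9, 153.5), (150.1, 154.2)]:
    if watch.update(a, b):
        print(f"Weak signal: redundant measurements disagree ({a} vs. {b})")
```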

For decades, our company has trained engineers in the Kepner-Tregoe problem-solving methodology. With K-T, as we call it, a problem is examined and a disciplined path is followed to identify observations that support or refute various hypotheses. Rothenberg’s weak-signal technique offers the possibility of avoiding the problem (the consequence) before experiencing it. “Real problems can’t hide for long, so how to find them early enough?” he asks. A weak signal, anything the least bit out of the ordinary, could be the somber mood of a normally cheerful operator, or even a gut feeling; weak signals don’t have to be restricted to measurements just because we’re instrument specialists. But instruments can be extremely useful when seeking to confirm or refute a hunch or a theory about the cause of a weak signal you happen to have noticed.

One methodology Rothenberg explores is to choose, from the myriad weak signals one might observe, those few that are clearly or possibly more than noise. Consider, then, what they might indicate, and of those potential outcomes, which are the most dire. From there, the observer can seek other indications that confirm or refute the potential outcome, rather like doing Kepner-Tregoe on a consequence that hasn’t befallen you yet.
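A sketch of how that triage might be organized. Everything here is invented for illustration (the boiler case borrows from the earlier paragraph): keep only the signals judged to be more than noise, rank the potential outcomes behind each one by severity, and list the indications that would confirm or refute the worst of them.

```python
# Illustrative weak-signal triage: filter, rank by severity, then go look for evidence.
from dataclasses import dataclass, field

@dataclass
class WeakSignal:
    observation: str                  # what was noticed
    above_noise: bool                 # judged clearly or possibly more than noise
    potential_outcomes: dict = field(default_factory=dict)  # outcome -> severity (1-10)
    evidence_to_check: list = field(default_factory=list)   # indications to confirm or refute

signals = [
    WeakSignal("Waste heat boiler losing temperature, using more fuel", True,
               {"Upstream upset starving the boiler": 8, "Controller tuning drift": 3},
               ["Compare upstream feed analysis", "Trend fuel demand against feed rate"]),
    WeakSignal("Slight hum from the spare pump", False,
               {"Bearing wear": 4}, ["Schedule a vibration check"]),
]

# Work the credible signals, most dire potential outcome first
for sig in (s for s in signals if s.above_noise):
    outcome, severity = max(sig.potential_outcomes.items(), key=lambda kv: kv[1])
    print(f"{sig.observation}\n  worst case: {outcome} (severity {severity})")
    for check in sig.evidence_to_check:
        print(f"  confirm/refute: {check}")
```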

It’s not that we don’t do such diagnostics from time to time, but we tend to focus on the known abnormal conditions. As Rothenberg illustrates, our processes don’t exist only in the known normal state and the known abnormal state (which we detect with alarms); they also wander into the unknown and not normal, the places where we haven’t been yet and where especially dire consequences, the so-called black swans, may be approaching.
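A toy illustration of that state-space idea, with made-up tags and limits: an operating point that sits outside the historical “normal” envelope yet trips no configured alarm lands in exactly the unknown-and-not-normal territory described above.

```python
# Illustrative only: tag names and limits are invented.
known_normal = {"temp": (145.0, 155.0), "fuel_flow": (10.0, 14.0)}  # historical envelope
alarm_limits = {"temp": (130.0, 170.0), "fuel_flow": (5.0, 20.0)}   # configured alarm bands

def classify(sample):
    in_normal = all(lo <= sample[tag] <= hi for tag, (lo, hi) in known_normal.items())
    in_alarm = any(not (lo <= sample[tag] <= hi) for tag, (lo, hi) in alarm_limits.items())
    if in_normal:
        return "known normal"
    if in_alarm:
        return "known abnormal (alarmed)"
    return "unknown and not normal: worth a second look"

print(classify({"temp": 150.0, "fuel_flow": 12.0}))  # known normal
print(classify({"temp": 162.0, "fuel_flow": 16.5}))  # no alarm, but not normal either
print(classify({"temp": 175.0, "fuel_flow": 12.0}))  # known abnormal (alarmed)
```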

While there is some hyperventilating about Big Data and the IIoT, thoughtful examination of the data we already have, noticing what is in front of us, can focus our attention on the blind spots where an avoidable consequence might be revealed or confirmed. Rothenberg’s soon-to-be-published book, “Situation Management for Industrial Operations” (Wiley & Sons), will give us some suggestions on how our human senses and minds can be trained to see past our preconceptions and cut through the clutter.

About the Author

John Rezabek | Contributing Editor

John Rezabek is a contributing editor to Control.