Voices: Lipták

What Is the Proper Role of Automation?

Automated Systems Can Lead to More and More Complexity and Confusion. Until the Instruments, Automation and Controls Get Extremely Reliable, We'll Still Need to Keep Humans in the Loop

By Bela Liptak

Q: The NTSB published its report of the recent Asiana flight that crashed in San Francisco (lat.ms/1rwL89l).

I find the contrast between the views of the NTSB and your views on the subject of using automation on the flight deck of an airliner interesting. According to the NTSB, the pilots didn't understand the automation in place, and they mistakenly believed that it should have done something to arrest their high descent rate. In fact, there is such a mode in the airliner autopilot systems, but that's not the one they had selected.

I have been employed at a large water and sewer utility for more than 28 years. In the mid-1980s we were early adopters of automation. We got started with those lovely old PM550 controllers from Texas Instruments. The first thing we did was to replace those cam stacks and microswitches for sequencing through a filter backwash. The new backwash system was very effective. The operator would push a button and with great reliability, a backwash sequence would happen. Operators could forget about the details, the interlocks, the permissives, and even where the valves and pump controls were. And they did.

In just a few years, most of the operators had forgotten how a backwash worked. Only the senior plant operator, the controls engineer and the plant superintendent remembered why things were done the way they were. And they got more and more grandiose with their designs. We upgraded the control system, and then the superintendent went wild. We had backwash schemes for energy savings, for water savings, for speed, for deep cleaning and other various permutations and combinations. And then he retired. The controls engineer moved on to a new job and pretty soon we had voodoo with a platform that was rapidly becoming more and more obsolete. We were scared to upgrade it (but we are doing just that).

Complexity and the maintenance, management and operation of such complexity often are forgotten in the design of complex systems. You make the point that automation could stop people from making stupid mistakes such as not maintaining speed or turning too sharply or not shutting the plant down properly. And perhaps you're right—but it leads to more and more complexity and confusion.

That's what happened with Air France 447. The airliner Pitot tubes iced up at an altitude where that was not supposed to be possible. The controls reverted to Alternate Law because the automation had no contingencies to handle three wildly different air speed indications at an altitude where the operating range between the wing stall speed and the compressor stall speed can be as little as 12 knots. Had the pilots been more experienced with manual controls, they would have known in a heartbeat what to do. But they had forgotten.

I fly small airplanes on instruments. Manually, I have a constant feel for what my airplane is doing. Yes, my instrument approaches are sloppier than someone's three-axis autopilot with auto throttles. But I know where I am, and I know what is supposed to come next. And because I fly for fun, I know better than to fly when I'm tired, stressed out or ill.

I think that until the instruments, automation and controls get extremely reliable, we'll still need to keep the humans in the loop. Sooner or later that instrumentation or automation will fail. And then, with so little experience working without automation, the human won't know what to do either. That's the lesson I take home from Three Mile Island, from Fukushima, from Air France 447 and so many other disasters.

Jake Brodsky
jakebrodskype@gmail.com

A: Continuous training is key for the operator. It is true that management needs to be in touch with operations more than ever before, but unfortunately they are often not technical.

Hiten A. Dalal
hiten_dalal@kindermorgan.com

A: Air France 447 (the crash in the Atlantic) was due to Airbus' computers basically giving up flying the plane when Pitot tubes froze, and the computers (five of them) switched to what Airbus calls "Alternate Law," which translates as, "You are the pilot. You fly the plane!" Another tragic example of too much dependence on automation.

I have a friend who was hired as an instructor for Air France, and was turning off the computers in the flight simulators to force the pilots to fly the plane. Airbus management told him "Don't do it. The computer is better than the pilot."

Bob Landman
www.hlinstruments.com

A: Yours is a good summary of the present man-machine relationship and of the state of confusion that prevails. This is not new. During the Industrial Revolution, when machines were introduced to substitute for human muscle, people were afraid that machines would cause accidents and that eliminating trades, say blacksmithing, would cause massive unemployment. Just the opposite occurred. Employment increased because these machines had to be designed, operated and maintained; filling these jobs paid more, the work was less exhausting, and the price of horseshoes dropped.

Today, when not human muscle, but some of the routine functions of the human brain are being delegated to machines, people once again worry about the consequences of excessive dependence on these gadgets, and rightly so, because misusing them can cause humans to forget how to do things! Yes, a new generation of "button pushers" could grow up: people who believe that a square root is a button on a keyboard, a logarithm is an African insect, and professional experience and wisdom are something you can look up on Google or Wikipedia.

Having prepared the Instrument Engineers' Handbook for some 50 years, I have observed that automation can be either helpful or harmful, depending on how it is used. If ignorant programmers are allowed to prepare fancy software that operators do not understand, and the operators are told simply to trust "what is in the box," this excessive dependence on something that can be wrong in the first place can create a mess. On the other hand, if we understand the proper role of automation, it makes our industries better and safer! The key is to clearly understand what I call overrule safety control (OSC).

During the past few years, I studied seven major accidents and found that one was caused by bad design, another by operator inaction due to excessive dependence on automation, and five by various degrees of manual operation of the process without OSC. One example of this "manual operation" occurred at Three Mile Island, when an operator sent water into the instrument air supply, and for hours nobody even realized what had happened. This culture, in an age of poor training and potential for terrorism, needs to change.

It is critical to understand what OSC is! OSC is like a railroad crossing barrier. On the one hand, it does not prevent the driver from visiting his mother-in-law, but it does prevent him from causing an accident by driving too fast to get there to taste her excellent cooking. OSC is like automatically keeping the vehicle's doors closed while it is moving and preventing the driver (the operator) from "overruling" that safety automation. OSC is the "red line" that neither the manual operator nor the autopilot must be allowed to cross.

So why is OSC absolutely safe?

  1. Because it overrules not only the unsafe actions of the driver, but also those of the autopilot. In other words, OSC is totally independent of both, and it overrules both! It overrules all unsafe instructions, regardless of whether they come from the operator or from the computer.
  2. Can the OSC itself fail? Naturally it can, even if it has triple-redundant backups using the very best sensors.
  3. But if the OSC becomes inoperative for any reason, both the operator and the autopilot continue functioning just as if it did not exist. It is like the safety locks on the car doors or the red light on the street corner: if it fails, you are simply back to normal control.
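The three properties above can be sketched as a simple command arbiter. This is a hypothetical illustration only, not any real avionics or plant logic; the function name, the single controlled variable (approach speed), and the limit values are all assumptions made for the sketch:

```python
# Hypothetical sketch of an Overrule Safety Control (OSC) layer.
# All names and limits are illustrative, not from any real system.

MIN_SAFE_SPEED = 140.0  # knots: illustrative approach-speed floor
MAX_SAFE_SPEED = 180.0  # knots: illustrative approach-speed ceiling

def osc_arbiter(commanded_speed, source, osc_healthy):
    """Return the speed command actually passed to the throttles.

    The OSC ignores who issued the command ('source' may be "manual"
    or "auto"): it clamps any command that crosses the red line,
    whether it comes from the pilot or from the autopilot.  If the
    OSC itself is inoperative, the command passes through unchanged,
    so control reverts to normal, like a failed red light.
    """
    if not osc_healthy:
        # Property 3: a failed OSC is transparent to both controllers.
        return commanded_speed
    # Properties 1 and 2: overrule ANY source that crosses the red line.
    return min(max(commanded_speed, MIN_SAFE_SPEED), MAX_SAFE_SPEED)

# The pilot commands an unsafely slow approach; OSC overrules it:
print(osc_arbiter(118.0, "manual", osc_healthy=True))   # -> 140.0
# The autopilot is overruled just the same:
print(osc_arbiter(118.0, "auto", osc_healthy=True))     # -> 140.0
# A failed OSC simply passes the command through:
print(osc_arbiter(118.0, "manual", osc_healthy=False))  # -> 118.0
```

Note that the arbiter sits outside both controllers: neither the pilot's nor the autopilot's logic needs to know it exists, which is what makes it independent of both.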

So what does this mean for Air France 447? It means only two things:

  1. Bad sensors should not be used. Pitot tubes can freeze up, and static pressure altimeters can give false information when air density changes (cold fronts, etc.). So forget such ancient sensors, and use redundant radar with GPS backup.
  2. OSC must be on all the time, no matter whether the autopilot drops out, and no matter how ignorant or careless the pilot is or what he believes the autopilot is doing. OSC simply prevents both the pilot and the autopilot from attempting to land at unsafe speeds.
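Redundant sensors only help if their readings are combined so that one failed source is outvoted rather than averaged in. A common way to do this is median selection; the sketch below is a minimal, hypothetical illustration of that idea (the function name and the sample readings are invented for the example):

```python
# Hypothetical median-select voter for redundant sensor readings.
# With three (or any odd number of) sources, a single failed sensor
# is outvoted instead of dragging the result toward its bad value.

def vote(readings):
    """Return the median of a list of redundant sensor readings."""
    ordered = sorted(readings)
    return ordered[len(ordered) // 2]

# Two healthy airspeed sources near 245 kt, one iced Pitot tube
# reading far too low; the voter returns a healthy value:
print(vote([245.0, 248.0, 61.0]))  # -> 245.0
```

A simple average of the same three readings would have been pulled down to about 185 kt by the failed sensor, which is exactly the kind of "wildly different air speed indications" confusion the letter describes.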

In the broader sense, our process control professionals must have a total understanding of the processes they control, must totally separate OSC from regular operational controls, and during the design phase, they must also control the software developers and not the other way around!

For examples of my proposed OSC designs, you can refer to my previous articles about eliminating the possibility of nuclear accidents by using automated underwater nuclear power plants (February 2014, bit.ly/1gLErMK), or you can read my article in the November 2013 issue (bit.ly/1r9s95F) about how OSC would have prevented the BP accident.

Béla Lipták
liptakbela@aol.com
