
What Is the Proper Role of Automation?

Aug. 21, 2014
Automated Systems Can Lead to More and More Complexity and Confusion. Until the Instruments, Automation and Controls Get Extremely Reliable, We'll Still Need to Keep Humans in the Loop

This column is moderated by Béla Lipták (http://belaliptakpe.com/), automation and safety consultant and editor of the Instrument and Automation Engineers' Handbook (IAEH). If you have an automation-related question for this column, write to [email protected].

Q: The NTSB published its report on the recent Asiana flight that crashed in San Francisco (lat.ms/1rwL89l).

I find the contrast between the views of the NTSB and your views on the subject of using automation on the flight deck of an airliner interesting. According to the NTSB, the pilots didn't understand the automation in place, and they mistakenly believed that it should have done something to arrest their high descent rate. In fact, there is such a mode in the airliner autopilot systems, but that's not the one they had selected.

I have been employed at a large water and sewer utility for more than 28 years. In the mid-1980s we were early adopters of automation. We got started with those lovely old PM550 controllers from Texas Instruments. The first thing we did was to replace the cam stacks and microswitches used to sequence a filter backwash. The new backwash system was very effective. The operator would push a button and, with great reliability, a backwash sequence would happen. Operators could forget about the details, the interlocks, the permissives, and even where the valves and pump controls were. And they did.

In just a few years, most of the operators had forgotten how a backwash worked. Only the senior plant operator, the controls engineer and the plant superintendent remembered why things were done the way they were. And they got more and more grandiose with their designs. We upgraded the control system, and then the superintendent went wild. We had backwash schemes for energy savings, for water savings, for speed, for deep cleaning and other permutations and combinations. And then he retired. The controls engineer moved on to a new job, and pretty soon we had voodoo running on a platform that was rapidly becoming obsolete. We were scared to upgrade it (but we are doing just that).

Complexity, and the maintenance, management and operation of that complexity, are often forgotten in the design of complex systems. You make the point that automation could stop people from making stupid mistakes such as not maintaining speed, turning too sharply or not shutting the plant down properly. And perhaps you're right, but it leads to more and more complexity and confusion.

That's what happened with Air France 447. The airliner's Pitot tubes iced up at an altitude where that was not supposed to be possible. The controls reverted to Alternate Law because the automation had no contingencies for handling three wildly different airspeed indications at an altitude where the margin between the wing stall speed and the high-speed (Mach) buffet can be as little as 12 knots. Had the pilots been more experienced with manual control, they would have known in a heartbeat what to do. But they had forgotten.

I fly small airplanes on instruments. Manually, I have a constant feel for what my airplane is doing. Yes, my instrument approaches are sloppier than someone's three-axis autopilot with auto throttles. But I know where I am, and I know what is supposed to come next. And because I fly for fun, I know better than to fly when I'm tired, stressed out or ill.

I think that until the instruments, automation and controls get extremely reliable, we'll still need to keep the humans in the loop. Sooner or later that instrumentation or automation will fail. And then, with so little experience working without automation, the human won't know what to do either. That's the lesson I take home from Three Mile Island, from Fukushima, from Air France 447 and so many other disasters.

Jake Brodsky
[email protected]

A: I will say that continuous training is key for the operator. It is true that management needs to be in touch with operating situations more than ever before, but unfortunately managers are often not technical people.

Hiten A. Dalal
[email protected]

A: Air France 447 (the crash in the Atlantic) was due to Airbus' computers basically giving up flying the plane when the Pitot tubes froze, and the computers (five of them) switched to what Airbus calls "Alternate Law," which translates as, "You are the pilot. You fly the plane!" Another tragic example of too much dependence on automation.

I have a friend who was hired as an instructor for Air France and who turned off the computers in the flight simulators to force the pilots to fly the plane. Airbus management told him, "Don't do it. The computer is better than the pilot."

Bob Landman
www.hlinstruments.com

A: Yours is a good summary of the present man-machine relationship and of the state of confusion that prevails. This is not new. During the Industrial Revolution, when machines were introduced to substitute for human muscle, people were afraid that machines would cause accidents and that eliminating trades such as blacksmithing would cause massive unemployment. Just the opposite occurred. Employment increased because these machines had to be designed, operated and maintained. Filling these jobs paid more, the work was less exhausting, and the price of horseshoes dropped. Today, when not human muscle but some of the routine functions of the human brain are being delegated to machines, people once again worry about the consequences of excessive dependence on these gadgets, and rightly so, because misusing them can cause humans to forget how to do things! Yes, a new generation of "button pushers" could grow up: people who believe that square root means a button on a keyboard, logarithm is an African insect, and professional experience and wisdom are things you can look up on Google or Wikipedia.

Having prepared the Instrument Engineers' Handbook for some 50 years, I have observed that automation can be either helpful or harmful, depending on how it is used. If ignorant programmers are allowed to prepare fancy software that operators do not understand, but are told to trust "what is in the box," this excessive dependence on something that can be wrong in the first place can create a mess. On the other hand, if we understand the proper role of automation, it makes our industries better and safer! The key is to clearly understand what I call overrule safety control (OSC).

During the past few years, I studied seven major accidents and found that the main cause of one was bad design, the cause of another was operator inaction due to excessive dependence on automation, and five were caused by various degrees of manual operation of the process without OSC. One example of this "manual operation" was at Three Mile Island, when the operator sent water into the instrument air supply, and for hours nobody even realized what had happened. This culture, in an age of poor training and potential for terrorism, needs to change.

It is critical to understand what OSC is! OSC is like a railroad crossing barrier. It does not prevent the driver from visiting his mother-in-law, but it does prevent him from causing an accident by driving too fast to get there and taste her excellent cooking. OSC is like automatically keeping the vehicle's doors closed when it is moving and preventing the driver (the operator) from "overruling" that safety automation. OSC is the "red line" that neither the manual operator nor the autopilot must be allowed to cross.

So why is OSC absolutely safe?

  1. Because it overrules not only the unsafe actions of the driver, but also those of the autopilot. In other words, OSC is totally independent of both, and it overrules both! It blocks all unsafe instructions, regardless of whether they come from the operator or from the computer.
  2. Can the OSC fail? Naturally it can, even if it has triple-redundant backups using the very best sensors.
  3. But if the OSC becomes inoperative for any reason, both the operator and the autopilot continue functioning just as if it did not exist. It is like the safety locks on the car doors or the red light on the street corner. If it fails, you are simply back to normal control, as the sketch after this list illustrates.
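
To make these three points concrete, here is a minimal sketch (in Python, purely for illustration) of how such an overrule layer could behave: it clamps an unsafe command whether it comes from the operator or from the autopilot, and if the OSC itself is inoperative, it passes the command through untouched so that control reverts to normal. The class and function names and the numeric speed limits are assumptions made for this example, not part of any actual flight or plant control system.

```python
# Minimal sketch of an overrule safety control (OSC) layer.
# All names and numeric limits below are illustrative assumptions,
# not an actual aircraft or plant implementation.

from dataclasses import dataclass


@dataclass
class Command:
    """A requested control action from either the operator or the autopilot."""
    source: str          # "operator" or "autopilot"
    target_speed: float  # requested speed, knots


@dataclass
class SafetyEnvelope:
    """The 'red line' that no command may cross (assumed limits)."""
    min_safe_speed: float = 137.0  # e.g., minimum safe approach speed
    max_safe_speed: float = 180.0  # e.g., maximum safe approach speed


def osc_filter(cmd: Command, envelope: SafetyEnvelope, osc_healthy: bool) -> Command:
    """Overrule unsafe commands regardless of their source.

    If the OSC itself is inoperative (osc_healthy is False), the command
    passes through untouched: control reverts to normal, just as if the
    OSC did not exist.
    """
    if not osc_healthy:
        return cmd  # OSC failed: back to normal control, no interference

    # Clamp the requested speed to the safety envelope.
    safe_speed = min(max(cmd.target_speed, envelope.min_safe_speed),
                     envelope.max_safe_speed)
    return Command(source=cmd.source, target_speed=safe_speed)


# The same check applies whether the request comes from the pilot or from
# the autopilot; the OSC is independent of both.
if __name__ == "__main__":
    envelope = SafetyEnvelope()
    print(osc_filter(Command("autopilot", 103.0), envelope, osc_healthy=True))
    print(osc_filter(Command("operator", 103.0), envelope, osc_healthy=False))
```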

So what does this mean for Air France 447? It means only two things:

  1. Bad sensors should not be used. Pitot tubes can freeze up, and static pressure altimeters can give false information when air density changes (cold fronts, etc.). So forget such ancient sensors, and use redundant radar with GPS backup; a simple way to vote among such redundant sources is sketched after this list.
  2. OSC must be on all the time, no matter if the autopilot drops out, and no matter how ignorant or careless the pilot is or what he believes the autopilot is doing. OSC simply prevents both the pilot and the autopilot from attempting to land at unsafe speeds.
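
On the sensor question in point 1, one common and simple way to combine redundant sources is median-select voting: with three valid readings, the middle value automatically rejects a single wild reading, such as a frozen Pitot tube. The sketch below is illustrative only; the function name, the choice of sources and the rule requiring at least two valid readings are assumptions, not a certified avionics design.

```python
# Minimal sketch of median-select voting over redundant speed sources.
# The sensor names and the "at least two valid readings" rule are
# illustrative assumptions, not a certified avionics design.

from statistics import median
from typing import Optional


def voted_airspeed(pitot: Optional[float],
                   radar: Optional[float],
                   gps: Optional[float]) -> Optional[float]:
    """Return a voted speed from whichever sources are currently valid.

    With three valid readings the median rejects a single wild value
    (such as a frozen Pitot tube). With fewer than two valid readings,
    no voted value is returned and the OSC should treat its own input
    as inoperative (see point 3 of the earlier list).
    """
    readings = [r for r in (pitot, radar, gps) if r is not None]
    if len(readings) < 2:
        return None          # not enough data to vote; OSC stands down
    return median(readings)  # middle value of three, mean of two


if __name__ == "__main__":
    print(voted_airspeed(pitot=None, radar=145.2, gps=146.0))  # Pitot frozen
    print(voted_airspeed(pitot=40.0, radar=145.2, gps=146.0))  # Pitot reads low
```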

In the broader sense, our process control professionals must have a total understanding of the processes they control, must totally separate OSC from regular operational controls, and during the design phase, they must also control the software developers and not the other way around!

For examples of my proposed OSC designs, you can refer to my previous articles about eliminating the possibility of nuclear accidents by using automated underwater nuclear power plants (February 2014, bit.ly/1gLErMK), or you can read my article in the November 2013 issue (bit.ly/1r9s95F) about how OSC would have prevented the BP accident.

Béla Lipták
[email protected]

About the Author

Béla Lipták | Columnist and Control Consultant

Béla Lipták is an automation and safety consultant and editor of the Instrument and Automation Engineers’ Handbook (IAEH).
