Cockpit Flight Control; PCV with Closed Failure Position

Readers Ask Our Experts About Safe and Economical Self-Regulating Valves, and Advancements in Cockpit Flight Control Systems

By Bela Liptak


Q: As I understand it, searchers finally located the final resting place of Air France Flight 447 in 2011 and were also able to recover some of the victims and all of the flight data recorders. It seems that the freezing of the Pitot tubes was indeed the root cause of that accident, and later, after the cascade of failures began, there were conflicting control inputs from the copilots until the captain realized (too late) what was happening.

Prior to working here, I used to work in the missile defense industry. We had the ability to accurately hit anything, anywhere, with kill vehicles traveling at around 15,000 miles an hour. I believe these technologies can be employed on aircraft as primary or secondary telemetry data sources. I was just wondering if there have been any further discussions or developments on the subject of cockpit flight control system advancements?

Mark Mason
Mark.Mason@mustangeng.com

A: My review indicates that the frozen Pitot tubes played an important role in the Air France Flight 447 tragedy in 2009, and I am also convinced that the Boeing 777 crash in San Francisco (Asiana Airlines Flight 214) could have been prevented by applying the very basics of automatic safety control, which would have overruled the actions of the automatic cockpit controls in one case and the copilots' inaction in the other. As to Pitot tubes, in recent years there has been some progress in converting to the use of more reliable and redundant speed detectors. On the other hand, the addition of automatic "overrule safety" controls has still not occurred, both because of ignorance and because of cost considerations.

What is meant by "overrule safety"? It refers to the automatic action that overrules all other controls, manual or automatic, and protects the system no matter what. In the processing industries, we have long applied this philosophy by, for example, providing pressure safety valves which cannot be turned off by anything or anybody. Similar "overrule safety" will probably be applied to underwater nuclear reactors, which would cool automatically, with thermal expansion opening the path for gravity-fed cooling water, without any valves or pumps. It is time for the transportation industry to also understand and accept automatic "overrule safety" controls that operate just like safety relief valves on boilers or air bags in cars, in that they cannot be deactivated by anything or anybody.
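The structure of such an "overrule safety" layer can be sketched in a few lines. This is only an illustration of the principle, not a real avionics or process-control design; the function name, speed limits, and envelope are assumptions made up for the example. The key feature is that the final clamping stage has no disable path at all, just as a relief valve has no "off" handle.

```python
# A minimal sketch of an "overrule safety" stage: whatever the operator
# or the automatic controls command, a final, non-bypassable stage clamps
# the command to a safe envelope. All numbers are illustrative.

SAFE_MIN_SPEED = 140.0   # knots; assumed stall margin for illustration
SAFE_MAX_SPEED = 320.0   # knots; assumed structural limit for illustration

def overrule_safety(commanded_speed: float) -> float:
    """Final stage in the control chain. There is intentionally no
    'disable' or 'override' argument: the envelope always applies."""
    return max(SAFE_MIN_SPEED, min(SAFE_MAX_SPEED, commanded_speed))

# The pilot or autopilot may command anything; the envelope always holds.
for cmd in (100.0, 250.0, 400.0):
    print(cmd, "->", overrule_safety(cmd))
```

The design point is that safety lives in the signature: because no caller can pass a "turn it off" flag, no operator error, software mode, or intruder can remove the protection.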

By the way, the same applies to trains, where automatic "overrule safety" controls (ATC) would also be essential. Such systems must automatically limit the maximum speed, based either on the speed limit at the particular location alone, or also considering rail curvature, inertia (the load on the train), push or pull mode of operation, weather conditions, wind direction, etc. The key is that it is active all the time, its activation requires no action on the part of the engineer, and he cannot overrule it.
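The speed-limiting logic described above reduces to a simple rule: each factor (posted limit, curvature, load, weather) yields its own maximum safe speed, and the enforced limit is the most restrictive of them. The sketch below illustrates this; all function names, derating factors, and the lateral-acceleration bound are assumptions for illustration, not figures from any real ATC specification.

```python
# Hedged sketch of always-on speed limiting: compute a separate maximum
# safe speed for each factor, then enforce the minimum of them all.
import math

def curve_limit_kmh(radius_m: float, max_lateral_g: float = 0.08) -> float:
    """Max speed so lateral acceleration v^2/r stays below an assumed bound."""
    v_ms = math.sqrt(max_lateral_g * 9.81 * radius_m)
    return v_ms * 3.6  # m/s -> km/h

def enforced_limit_kmh(posted_kmh: float, radius_m: float,
                       load_factor: float, weather_factor: float) -> float:
    limits = [
        posted_kmh,                    # posted limit at this location
        curve_limit_kmh(radius_m),     # rail curvature
        posted_kmh * load_factor,      # derate for a heavy consist
        posted_kmh * weather_factor,   # derate for rain or wind
    ]
    return min(limits)                 # most restrictive limit always wins

# On a 600 m curve, the curvature limit governs even if 120 km/h is posted.
print(round(enforced_limit_kmh(120, 600, 0.95, 0.9), 1))
```

Because the enforced limit is recomputed continuously from current conditions, it needs no action from the engineer to arm it, which is exactly the "active all the time" property argued for above.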

Yes, transportation safety technology is available right now. What is missing is the willingness to make the investment needed to add "overrule safety" automation. It is bordering on the ridiculous that, on the one hand, our GPS can measure the location and speed of any vehicle, and some vendors are considering the use of automatic mini-drones to deliver pizzas, while others feel that automatically limiting the speed of trains or airplanes is too complicated or costly and can be left to bad operating controls and/or to untrained or sleepy engineers and pilots. It is the responsibility of our profession, that of the International Society of Automation, to bring this industry, too, into the 21st century.

Béla Lipták
Liptakbela@aol.com

A:  Personally, I have found an automatic system that is on by default, but is manually overrideable when needed, to be of most value. However, I also think it really depends on the process under control, because some are just not safely (or even at all) operable in the manual mode. In any case, I think such design decisions should be made on a case-by-case basis and by persons with enough experience/knowledge of the process to reasonably evaluate the pros and cons.

Never having flown a plane myself, I would not be so sure that non-overridable auto speed is the way to go. Some type of warning of the slow speed and that the auto-speed control was only "armed" might be more reasonable.

By the way, I personally really do not like some of the latest air bag safety functions I have come across. For instance, I have been really annoyed after being stymied by the transmission position/brakes interlock when trying to restart an engine that died in traffic. And, although I realize it is not really a fault of the automatic control logic, how about those regularly failing ($900 without installation) BMW passenger seat occupancy sensors and the fact that, in my opinion, such sensor failures are not atypical?

Al Pawlowski
avp2@almont.com

A: I completely agree that we have the technology to prevent accidents like that. There is a large body of work concerning cockpit automation, under the heading of Situational Awareness. Mica Endsley has done some excellent work. Wikipedia has a good article on Situation Awareness that has lots of references to other work.



Comments

  • The NTSB published its report on the recent Asiana flight that crashed in San Francisco.

    http://www.latimes.com/local/la-me-asiana-crash-hearing-20140625-story.html

    I find the contrast between the views of the NTSB and your views on the subject of using automation on the flight deck of an airliner interesting. According to the NTSB, the pilots didn't understand the automation in place, and they mistakenly believed that it should have done something to arrest their high descent rate. In fact, there is such a mode in the airliner autopilot systems, but that's not what they had selected.

    Allow me to share some of my experiences with the human side of automation:

    I have been employed at a large water and sewer utility for more than 28 years. In the mid 1980s we were early adopters of automation. We got started with those lovely old PM550 controllers from Texas Instruments. The first thing we did was to replace those cam stacks and microswitches for sequencing through a filter backwash.

    The new backwash system was very effective. The operator would push a button and, with great reliability, a backwash sequence would happen. They could forget about the details, the interlocks, the permissives, and even where the valves and pump controls were. And they did.

    In just a few years, most of the operators had forgotten how a backwash worked. Only the senior plant operator, the controls engineer, and the plant superintendent remembered why things were done the way that they were. And they got more and more grandiose with their designs.

    We upgraded the control system and then the superintendent went wild. We had backwash schemes for energy savings, for water savings, for speed, for deep cleaning, and various other permutations and combinations. And then he retired. The controls engineer moved on to a new job, and pretty soon we had voodoo on a platform that was rapidly becoming more and more obsolete. We were scared to upgrade it (but we are doing just that).

    The maintenance, management, and operation of such complexity are often forgotten in the design of complex systems.

    You make the point that automation could stop people from making stupid mistakes like not maintaining speed, or turning too sharply, or not shutting the plant down properly. And perhaps you're right, but it leads to more and more complexity and confusion.

    That's what happened with Air France 447. The airliner Pitot tubes iced up at an altitude where that was not supposed to be possible. The controls reverted to Alternate law because the automation had no contingencies to handle three wildly different air speed indications at an altitude where the operating range between the wing stall speed and the compressor stall speed can be as little as 12 knots.

    Had the pilots been more experienced with manual controls they would have known in a heartbeat what to do. But they had forgotten.

    I fly small airplanes on instruments. Manually. I have a constant feel for what my airplane is doing. Yes, my instrument approaches are sloppier than someone's three axis autopilot with auto throttles. But I KNOW where I am and I know what is supposed to come next. And because I do this for fun, I know better than to fly when I'm tired, stressed out, or ill.

    I think that until the instruments, automation, and controls get extremely reliable, we'll still need to keep the humans in the loop. Sooner or later that instrumentation or automation will fail. And then, with so little experience working without automation, the human won't know what to do either. That's the lesson I take home from Three Mile Island, from Fukushima, from Air France 447, and so many other disasters.

    Thank-you for all the good work you've done, Mr. Liptak. You remain a giant in this industry to our engineering staff at the Washington Suburban Sanitary Commission.

    Jake Brodsky jakebrodskype@gmail.com


  • Thanks, Jake. You summed up the present cultural state of the human-machine relationship perfectly, and I fully agree that our society is in a state of confusion. The new generation of "button pushers" is growing up believing that it is good enough if Google and Wikipedia are smart; they do not need to be. Having said that, we must not let programmers run wild. Our role, the role of control engineers, is essential, because we know how to keep a pipe straight (BP) or how to build a nuclear power plant under water so that it needs no man-made energy to be absolutely safe. If our knowledge is properly applied, automation can make industry safer by preventing panicked or ignorant operators (or terrorists) from doing unsafe things. Overrule safety controls (OSC) are like red lights or lift gates at street corners: they do not prevent the driver from, say, visiting his mother-in-law, but they do protect both him and others.

    So what does this mean for Air France 447? It means only two things: 1. Bad sensors should not be used. Pitot tubes can freeze up, and static-pressure-based altimeters can give false information when air density changes (cold fronts, etc.), so forget such ancient sensors and use radar or GPS. 2. OSC must be on all the time and must not depend on what the pilot believes or what he selects. The pilot, just like the person crossing the street, must not be allowed to turn off the "red light." OSC must be on all the time and prevent the pilot from doing stupid things. Best regards, Béla

