
Reader Feedback: Who Do You Trust? People or Machines?

Oct. 17, 2014
If We Trust the Machine, Are We Not Trusting Another Human to Correctly Program the Machine to Always Take Corrective Action?

In the "Ask the Experts" department in  Control's August 2014 issue, "The Proper Role of Automation?" (p. 54), Béla Lipták references the 1979 TMI and the 2009 Air France 447 accidents to propose adding additional critical control and safety systems, or modifying current systems to prevent operator action that overrides the safety function (OSC).

The root cause of these and many other accidents was incorrect or inconsistent redundant information presented to the digital control system and the operator. The solution was to correct and improve the sensor design and installation, and to improve operator training and management oversight.

Because many applications in the hydrocarbon processing, power and transportation industries have no safe failure mode, and because digital platforms (BPCS, SIS, etc.) can fail and have failed unpredictably (outputs on, off, cycling, etc.), I would not trust such systems to override operator action. When the digital device fails to increase or decrease level, cooling, speed, etc., shouldn't the operator have the ability to take manual control?
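
To make the design choice concrete, here is a minimal sketch in Python of the output-selection logic at stake: whether an operator-asserted manual mode bypasses the automatic output, or the machine retains final authority. The names (LoopState, select_output, allow_operator_override) and values are invented for illustration and do not describe any actual BPCS or SIS implementation.

```python
# Illustrative sketch only: a hypothetical control-loop fragment showing the
# design choice at issue. Names and values are invented, not from any real
# BPCS/SIS product.

from dataclasses import dataclass

@dataclass
class LoopState:
    auto_output: float      # output computed by the digital controller (0-100 %)
    manual_output: float    # output requested by the operator (0-100 %)
    operator_manual: bool   # operator has taken manual control
    auto_healthy: bool      # platform diagnostics report normal operation

def select_output(state: LoopState, allow_operator_override: bool) -> float:
    """Choose the value sent to the final element (valve, drive, etc.)."""
    if allow_operator_override and state.operator_manual:
        # Design A: trust the operator -- manual mode always wins.
        return state.manual_output
    if not state.auto_healthy:
        # Design B: trust the machine, falling back to manual only when its
        # own diagnostics admit a failure (which an unpredictable failure
        # such as a stuck or cycling output may never do).
        return state.manual_output
    return state.auto_output

# Example: the digital output is stuck high but still reports itself healthy.
stuck = LoopState(auto_output=100.0, manual_output=40.0,
                  operator_manual=True, auto_healthy=True)
print(select_output(stuck, allow_operator_override=True))   # 40.0: operator acts
print(select_output(stuck, allow_operator_override=False))  # 100.0: machine wins
```

In the second case, removing the operator's ability to override leaves no path to corrective action unless the failed platform correctly diagnoses its own failure.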

The fundamental question is "What or whom do you trust—the operator or the machine?" If the machine, are we not trusting another human to correctly program the machine to always take corrective action? With the introduction of artificial intelligence controlling all sorts of vehicles, each of us will soon face this question.

J. Troy Martel
Safe Operating Systems Inc.