Automation continues to expand into manufacturing processes, driven by the need for more production at lower cost. As it expands, it grows in complexity, because the technology available for process equipment and control keeps becoming more capable and more cost-effective. As complexity grows, the process becomes more hidden from the operators, who are nonetheless expected to deal with unforeseen problems as they arise.
The problems associated with opaque automation are already with us. The most common way to obscure a process is with a flood of alarms, so good work is being done in the alarm management field. Early fly-by-wire aircraft flew into terrain when the pilots were unable to correct what the automation was doing because they didn't understand what was going on. The investigators blamed the crashes on opaque automation.
A continuous process doesn't have much automation aside from its alarms and interlocks. The emphasis is on making the measurements of the state of the process available to operators, while those measurements are held to setpoints by controllers. This is changing as ways to apply procedural control are being developed. It's not that continuous processes don't have procedures—the standard operating procedures manual for a process is full of them. Only recently has the procedural control developed for batch processes been considered for use with continuous processes.
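The idea of measurements held to setpoints by controllers can be sketched in a few lines. This is a minimal, hypothetical illustration (the gains, process model and numbers are invented, not from any vendor system): a discrete PI controller driving a simulated first-order process toward its setpoint.

```python
# Minimal sketch of continuous control: a PI controller holds a
# measured process variable at a setpoint. All values are invented
# for illustration, not taken from a real control system.

def pi_step(error, integral, kp=2.0, ki=0.5, dt=1.0):
    """One PI update: returns (controller output, updated integral term)."""
    integral += error * dt
    return kp * error + ki * integral, integral

def run_loop(setpoint=50.0, steps=200):
    """Simulate a simple first-order process driven by the PI output."""
    pv, integral = 20.0, 0.0   # process variable starts away from setpoint
    for _ in range(steps):
        output, integral = pi_step(setpoint - pv, integral)
        pv += 0.1 * (output - pv)   # crude first-order process response
    return pv
```

Run long enough, the loop settles the process variable at the setpoint; the integral term is what removes the steady-state offset a proportional-only controller would leave.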
Automating process procedures can easily lead to opaque automation when the procedure can go in different directions, depending on alarms and changes in the process. An operator who has no idea what will happen next has a limited ability to keep the process out of trouble. The process equivalent of flight into terrain often leads to a fire, and possibly explosions, in a chemical process. Discrete processes suffer loss of production when part of a machine breaks and the debris falls into a gearbox.
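A procedure that branches on alarms need not be opaque. As a hypothetical sketch (the step and alarm names here are invented for illustration), the branching can be written as an explicit state table, so every possible "what happens next" is visible in one place rather than buried in code:

```python
# Hypothetical branching procedure written as an explicit state table.
# Every (current step, condition) pair and its outcome is listed, so an
# operator or engineer can read the whole procedure at a glance.
# Step and alarm names are invented for illustration.

PROCEDURE = {
    # (current step, condition)            -> next step
    ("charge_reactor", "ok"):                 "heat_to_reaction_temp",
    ("charge_reactor", "low_level_alarm"):    "hold_and_alert_operator",
    ("heat_to_reaction_temp", "ok"):          "run_reaction",
    ("heat_to_reaction_temp", "high_temp_alarm"): "emergency_cooldown",
    ("run_reaction", "ok"):                   "transfer_to_storage",
}

def next_step(current, condition):
    """Look up the next step; any unlisted combination halts safely."""
    return PROCEDURE.get((current, condition), "hold_and_alert_operator")
```

For example, `next_step("heat_to_reaction_temp", "high_temp_alarm")` returns `"emergency_cooldown"`, and any combination the designers did not anticipate falls through to a safe hold. The logic is the same either way; the table form is what keeps it from becoming inscrutable.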
The principal reason procedures become opaque is that they're designed by clever (not to say overly inventive) engineers and programmers who are focused on getting the job done without considering the need for a clear human interface. The result is inscrutable logic encoded into procedures that aren't readable by ordinary mortals, assuming that most people don't know a database from first base.
Change is required in the way that procedures are designed, but people don't want to change if what they have works. They have to be shown that what they think works really isn't working, and that can get expensive. Some towns don't install a traffic light until someone is killed at the intersection.
One reason that procedures are encoded is efficiency of computation. Years ago, programmers resorted to assembly language programs to save expensive memory and speed up slow applications. Today's computers have several orders of magnitude more memory and speed. Tomorrow's computers could make today's look ridiculous, especially if quantum weirdness can be tamed. Efficiency of computation is not an issue with automation-scale applications.
Another factor that complicates procedures is the number of translations that must occur between the user's requirements and the functioning machine. An engineer must understand what the user wants, which requires an engineer who has worked as a user. A programmer must understand what the engineer wants, but it is more difficult to find programmers who understand anything but getting code to work in a computer. The problem must be reduced to mathematics and branching tests.
The U.S. FDA requires a paper-laden trail through the V-model (Figure 1) to assure that software will do what the user required. Users aren't always good at defining exactly what they want ("I'll know it when I see it"), which leads to multiple iterations of the V-model until the result looks like what the user wants. However, that may change when the user tries to use it, and discovers flaws in the human interface.
The situation today is not unlike the early days of telegraphy when a user drafted a message and took it to a telegraph office. A telegrapher, who knew how to get the message to its destination, translated it to Morse code. The receiving telegrapher decoded it, and gave it to a messenger who took it to the recipient. Then the telephone was invented, and most of that structure went away. The sender could deliver the message to the receiver directly. Well, "directly" if you don't count wiring, switching, trunk lines and central offices that made it possible. Obsolete Pony Express riders could say "I told you so" to the telegraphers.
Natural Language Should Be the Norm
First, it's necessary to stop thinking of controlled process equipment as computer peripherals. Process equipment is designed and purchased to provide process functions, such as mixing, distilling, heating, machining, assembly and packaging. The equipment is inert until it's controlled by a human, or automated by a computer or other machine.