Automation, Operators and the Bottom Line

Automation brings operator challenges as well as benefits. Addressing them means rethinking both control rooms and training.

By Ian Nimmo


Today, technology has pushed industrial processes to the limit while optimizing them for maximum efficiency and the most cost-effective use of energy. State-of-the-art plants are monitored by sophisticated “smart” sensors that even calibrate themselves. They employ process control devices that tune themselves to plant conditions, and computers that can predict and optimize based on market requirements. But they have an Achilles heel.

I don’t believe that technology has gone beyond the capabilities of our current operators, but I do think we neglect to ensure safe production. Omissions in the design of the interface between the human and the machine leave holes in our defenses, allowing human error to steal the very profits we worked so hard to gain through this technology.

The work of the Honeywell-backed Abnormal Situation Management Consortium shows that abnormal events still cost the petrochemical industry alone $20 billion per year. A typical plant experiences roughly 7.3 days of unplanned, incident-related shutdowns per year, at an average cost of $250,000 per hour. Those 7.3 lost days total roughly $43 million per year, not all of which is preventable. A 5% improvement on these figures would return an extra $2,150,000 per year.
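For readers who want to check the arithmetic, a few lines of Python reproduce the figures (the $43 million and $2,150,000 in the text are rounded; the raw calculation gives $43.8 million and about $2.19 million):

```python
# Back-of-envelope check of the ASM Consortium figures cited above.
# All inputs come from the article text, not measured data.

HOURS_PER_DAY = 24
downtime_days = 7.3        # unplanned shutdown days per year
cost_per_hour = 250_000    # average cost of lost production, $/hour

annual_loss = downtime_days * HOURS_PER_DAY * cost_per_hour
print(f"Annual loss:  ${annual_loss:,.0f}")                # $43,800,000

improvement = 0.05         # the 5% improvement discussed above
print(f"5% recovered: ${annual_loss * improvement:,.0f}")  # $2,190,000
```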

The consortium found that in most plants not meeting their production targets, 8% to 12% of the loss was due to preventable abnormal situations. Losses that take directly from the bottom line are easy to see, but a hidden cost lurks as well: operators back the process away from its full potential so they can run at a more comfortable, less stressful level.

Automation Ironies

Operators struggle to work in badly designed central control rooms. The number-one issue is stress, which degrades the operator’s ability to perform, often leads to shift-work-related illnesses, and is the primary reason operators give for declining console operator jobs.

What are the missing control-room design elements, and why do we neglect them? British engineering psychologist Lisanne Bainbridge outlined them in her article “Ironies of Automation,” in New Technology and Human Error, J. Rasmussen, K. Duncan and J. Leplat, eds. (Wiley, 1987).

The first irony, Bainbridge says, is that by taking away the easy parts of the operator’s task, automation can make the difficult parts of the job even more difficult.

Second, while many system designers regard human beings as unreliable and inefficient, they still leave people to cope with those tasks the designer could not think how to automate—most especially, the job of restoring the system to a safe state after some unforeseen failure.

Third, in highly automated systems, the task of the human operator is to monitor the system to ensure the “automatics” are working as they should. But even the best motivated people have trouble maintaining vigilance for long periods of time—say, 12-hour shifts. They are thus ill-suited to watch out for these rarer abnormal conditions.

Fourth, skills need to be practiced continuously to be kept sharp. Yet an automatic system that fails only very occasionally denies human operators the opportunity to practice the skills they will need in an emergency. Thus, they can become de-skilled in just those abilities that justify their marginalized existence.

Bainbridge concludes, “Perhaps the final irony is that it is the most successful automated systems with rare need for manual intervention which may need the greatest investment in operator training.”

The Real Cost of Bad Design

Texaco’s Pembroke refinery suffered a serious incident and explosion in 1994 that affected hundreds of people, seriously injured twenty-six employees and caused nearly $100 million (£48 million) in damage. According to the Health & Safety Executive’s investigation, the major factors that contributed to this incident were:

  • Too many poorly prioritized alarms;
  • Control-room displays that did not help the operator understand what was happening;
  • Operators inadequately trained to deal with a stressful and sustained plant upset;
  • A work environment that contributed to disruptions and stress.

In an incident at the Esso Longford Gas Plant in September 1998, 10 tonnes of flashing hydrocarbon were released and exploded. The explosion killed two people, injured eight, and dug a crater 1.5 meters deep. The control room had to be evacuated, restricting the operators’ ability to shut the unit down safely. The estimated loss to industry was $1.3 billion.

Bainbridge and others studying such incidents concluded that training, and the continuous practice of skills, is critical to success. The design of the operator workspace directly affects the operator’s ability to perform to the standards that successful intervention requires. Yet management continues to neglect training, giving it a low priority, and continues to fund equipment shelters rather than buildings that support the many activities operators perform throughout a 24-hour operation.

Four Stages of Intervention

Successful intervention involves four stages: orienting, evaluating, acting and assessing.  

Orienting involves perceiving the exact problem. During a process disturbance, operators can receive an avalanche of poorly prioritized alarms, which makes this task difficult. They must process the data by working with the human-computer interface (HCI) to reach the next stage, evaluation.

This process has been named “situation awareness.” To achieve it, designers must focus on the control room’s functional layout and its consoles, ensuring adjacency for good communication and collaboration, and on the HCI itself: its level of detail and its navigation method.

This stage can be dramatically improved by following the practices outlined in EEMUA Publication 191, the guideline for alarm management; the style guide for HCIs outlined in EEMUA Publication 201; and good workspace design practices as outlined in ISO 11064, the standard for ergonomic design of control centres.
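To make the alarm-management side of this concrete, below is a minimal sketch of the kind of alarm-flood metric EEMUA 191 encourages sites to track. The 10-minute window and the more-than-10-alarms flood threshold are the benchmarks commonly quoted from that publication; treat them as assumptions here and confirm them against the guideline itself.

```python
# Minimal alarm-flood detector, assuming the commonly quoted EEMUA 191
# benchmark that more than 10 alarms in a 10-minute window constitutes
# a flood. Alarm timestamps are assumed to be seconds since epoch.

from bisect import bisect_left, bisect_right

WINDOW_S = 600          # 10-minute window
FLOOD_THRESHOLD = 10    # more than this many alarms per window => flood

def flood_windows(alarm_times: list[float]) -> list[tuple[float, int]]:
    """Return (window_start, alarm_count) for each alarm that opens a
    flooded 10-minute window."""
    times = sorted(alarm_times)
    floods = []
    for t in times:
        # Count alarms falling in the window [t, t + WINDOW_S].
        count = bisect_right(times, t + WINDOW_S) - bisect_left(times, t)
        if count > FLOOD_THRESHOLD:
            floods.append((t, count))
    return floods
```

Run against a day’s alarm journal, a metric like this shows how often operators face the avalanche described above, and whether alarm rationalization work is paying off.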
