By William L. Mostia, Jr., PE, principal, WLM Engineering Co.
THIS ARTICLE is the first of a two-part series exploring the classifications of human error, why humans make errors, how errors occur in instrument design, construction, operation, and maintenance, methods to minimize human errors, and some of the methods used to quantify them.
Human error plays a role in all human activities. We all make mistakes. Not all mistakes are harmful: some cause no problems, and some may actually benefit mankind. But mistakes in the design, construction, operation, and maintenance of chemical processes can have costly, if not disastrous, effects.
Human errors can be classified in a number of ways. One is to classify them as errors of commission or omission. An error of commission occurs when someone performs an act that results in an error, while an error of omission occurs when someone fails to do something, creating an error.
Errors can also be classified as active or latent. With an active error, the error or its consequence is immediately apparent; with a latent error, it is not. A latent error may require time, conditions, or another action before its consequence becomes apparent.
Errors can also be classified as random human errors or errors in which human factors are involved. Random human errors are those that can only be predicted statistically. Human errors due to human factors are those in which a procedural factor, management factor, design factor, or some human characteristic facilitated the error. A study of 136 refinery incidents by the Battelle Memorial Institute indicated that human error was involved 47% of the time. Of these errors, 19% were random human error, while 81% involved human factors.
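The Battelle figures above can be turned into rough incident counts with a little arithmetic. The counts below are illustrative estimates derived from the cited percentages, not figures reported by the study itself:

```python
# Rough arithmetic behind the Battelle refinery study percentages cited above.
# The derived counts are illustrative estimates, not reported study data.
total_incidents = 136
human_error_share = 0.47   # incidents in which human error was involved
random_share = 0.19        # of those, random human error
factors_share = 0.81       # of those, errors involving human factors

human_error_incidents = round(total_incidents * human_error_share)
random_errors = round(human_error_incidents * random_share)
factor_errors = round(human_error_incidents * factors_share)

print(human_error_incidents)  # roughly 64 incidents involved human error
print(random_errors)          # roughly 12 were random human error
print(factor_errors)          # roughly 52 involved human factors
```

In other words, of roughly 64 incidents involving human error, only about a dozen were truly random; the large majority had an identifiable procedural, management, or design contributor.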
Human errors can also be classified as to the reason the error was made.
Some feel it is human nature to make errors. This may be true, but there are reasons why people make mistakes, some of which are not under the direct control of the person making the mistake. Understanding these reasons may prevent some mistakes. I've grouped them into three broad categories.
FIGURE 1: TAGGED OUT
Maintenance tags covering one of the emergency feedwater valve indicator lights may have contributed to human error at Three Mile Island nuclear plant.
1. People-Oriented Errors
Slips (Lapses, Execution) Errors--Slips are actions not in accordance with your intentions; they occur despite your best intentions to perform a task correctly. They can come from short-term inattention (internal) or distractions (external). Some are due to subconscious processes, such as reversing numbers or letters or misspelling words you know how to spell. These errors often occur on seemingly routine tasks.
Capture Error--Occurs when a frequently performed activity is done instead of the desired one. An example is when you desire to go somewhere after work but find that you missed your turn due to being on the "going-home autopilot."
Identification Error--This is where something is misidentified, which leads to error. In a study of incidents in refineries, three-quarters of the human errors involving equipment involved labeling. Current practices of reduced staffing in process plants leave fewer people covering more area, which leads to more reliance on equipment, piping, and instrument/electrical tagging and identification.
Impossible Tasks--Some tasks are overly complex, very difficult, or even impossible to do. This may result in shortcuts or alternative methods which, if not well thought out, may lead to errors. The more complex a decision or action, the more likely an error will be made. Information overload is a form of complexity. A human can only process so much information at a time, and if too much information is presented, errors may be made. An example is an abnormal condition that generates hundreds or even thousands of alarms, overwhelming the operator. Drawings and DCS screens that are too busy are another example of potential information overload.
Input or Misperception Errors--Input information to make a decision or perform a task may be incorrectly perceived, perhaps because the presentation of the information is misleading or complex. Overly complex instrumentation systems are an example of this. This also may be because of making assumptions about data that may be missing or being confused about the applicability of information.
Lack of Knowledge--A common source of this type of error is the failure to get the appropriate information from other people or departments. In Operations, it could be a lack of situational awareness from the instruments, particularly during abnormal conditions. Lack of knowledge also frequently leads to assumptions, and there is an old saying that "assumptions are the mother of all screw-ups."
Mindset--Mindset is generally a function of expectations and habits. Humans tend to be creatures of habit, and this can lead us astray. Many times we see what we expect to see, even if there is evidence to the contrary, particularly under high stress or time limits. An example could be a valve sequence that is not in the normal order (e.g., open valve 1, close valve 3, open valve 2, open valve 6, close valve 4...), a sequence that does not run from left to right (pumps labeled C, B, A from left to right rather than A, B, C), or color coding that does not match the rest of the plant. At times, mindset can go to the extreme, where humans believe things to be true despite substantial evidence to the contrary. This may be the result of folklore (things that are widely believed to be correct but are not), habit, or faulty experience or rationalization of experience. It can also occur by applying actions done by habit or learned from past experience to an inappropriate situation. A number of accidents have resulted from operators failing to believe their instruments, because "it just can't be true."
Over-Motivation or Under-Motivation--Over-motivation can come from being too zealous, such as completing a job too quickly to please a supervisor or for some other benefit. Working too fast can lead to shortcuts and risk-taking. High-pressure production environments with incentives, for example, can lead to risk-taking and mistakes. Under-motivation may come from low morale, whether rooted at work or at home. Under-motivation can lead to shortcuts, risk-taking, and leaving work undone, creating latent errors (for someone else to suffer). Boredom can also lead to under-motivation.
Reasoning Error--This is where a person has the correct information to make a decision or take an action but comes to the wrong conclusion. Lack of training and/or experience facilitates this type of error.
Task Mismatches--Sometimes people are mismatched for the task: too small, too big, not enough people, too many people, lack of skill, etc.