Some examples of this are revising an overly complex procedure into a simpler one; creating a procedure to control safety-system bypasses so that a bypass is not inadvertently left engaged; installing an interlock that prevents an operator from taking an action unless some condition is satisfied; and providing a high-level shutdown to prevent an operator from overfilling a tank (a simple sketch of the last two ideas follows the quote below).
Trevor Kletz, an author on this subject, said, “Some errors can be prevented by better training, or increased supervision, but the most effective action we can take is to design our plants and methods so as to reduce the opportunities for error or minimize their effects.”
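As a rough illustration of those last two design measures, here is a minimal sketch of a permissive interlock combined with a high-level shutdown. The tank, valve, and 90% trip point are hypothetical, illustrative choices, not taken from any standard or from the original article.

```python
class InletValve:
    """Hypothetical inlet valve on a storage tank."""
    def __init__(self):
        self.is_open = False

    def open(self):
        self.is_open = True

    def close(self):
        self.is_open = False

HIGH_LEVEL_TRIP = 90.0  # percent of tank capacity (illustrative value)

def request_open(valve, level_percent):
    """Permissive interlock: honor the operator's open command
    only when the tank level allows it."""
    if level_percent >= HIGH_LEVEL_TRIP:
        print(f"Interlock: open refused, level {level_percent:.1f}% "
              f">= trip point {HIGH_LEVEL_TRIP:.1f}%")
        return False
    valve.open()
    return True

def high_level_shutdown(valve, level_percent):
    """Independent protection: close the valve if the level
    reaches the trip point, regardless of operator action."""
    if level_percent >= HIGH_LEVEL_TRIP and valve.is_open:
        valve.close()
        print("High-level shutdown: inlet valve closed")

# With the tank at 95%, the operator's open request is refused.
valve = InletValve()
request_open(valve, 95.0)
```

The point of the design is that safe behavior does not depend on the operator remembering the limit; the system enforces it.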
3. Tolerance: This is where errors are expected but the system is tolerant of them. An example is a human-machine interface that accepts entry of a wrong number but prompts the operator to verify the value before acting on it. Back-up systems that protect against a human error are another example.
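A minimal sketch of such a tolerant entry field, assuming a hypothetical numeric setpoint with illustrative limits (nothing here comes from a real HMI product):

```python
def read_setpoint(prompt, low, high):
    """Read a numeric setpoint; reject out-of-range values and
    require the operator to confirm before accepting the entry."""
    while True:
        try:
            value = float(input(f"{prompt} [{low}-{high}]: "))
        except ValueError:
            print("Not a number, please re-enter.")
            continue
        if not low <= value <= high:
            print(f"Value outside {low}-{high}, please re-enter.")
            continue
        # The confirmation step gives the operator a chance to
        # catch a slip (e.g., a misplaced decimal point).
        if input(f"Confirm setpoint {value}? (y/n): ").strip().lower() == "y":
            return value

# Hypothetical usage: a temperature setpoint limited to 50-150 deg C.
# setpoint = read_setpoint("Reactor temperature setpoint, deg C", 50.0, 150.0)
```

The range check blocks impossible values outright, while the confirmation prompt tolerates a plausible but mistaken entry by giving the operator a second look.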
4. Mitigation: This is where systems are in place to mitigate the consequences of an error. Examples are a dike around a vessel to contain the liquid if an operator overfills the tank, or a deluge system.
5. Lifecycle Approach: Obviously, it is best if we can prevent human error altogether, but in general that is an unattainable goal. However, many errors can be prevented or minimized with proper design of the system, engineering controls, administrative controls, training, and consideration of human factors.
One method of minimizing errors in instrument systems is the lifecycle approach, in which a formal lifecycle governs the design, installation, operation, and maintenance of instrument systems. This approach can use all of the error-reduction methods already discussed (and more) but formalizes their use. An example of a lifecycle approach to a system is given in ISA 84.01, “Application of Safety Instrumented Systems for the Process Industries.”
Cost of Errors
It is difficult to estimate the cost of errors in instrumentation systems. While we often assign blame for obvious errors, we seldom evaluate their overall cost unless they come to the attention of upper management. This is in part because errors are “hidden,” with knowledge of them kept to as few people as possible to avoid negative consequences.
Also, management and supervision often consider a certain number of errors unavoidable, treating them as a cost of doing business rather than something they can really control. People are yelled at, chastised, supervised, punished, and criticized, but many companies have essentially no quality system for reducing errors in instrument design, installation, operation, and maintenance. It is assumed that the supervisor and the normal managerial system will minimize these errors, but that is seldom effective at reducing errors over the long term.
To quantify the probability of human error, we must somehow quantify the propensity of humans to make errors under the conditions of interest. Because we are dealing with the complexity of human actions, this is difficult.
Several methods have been developed, among them the Human Error Assessment and Reduction Technique (HEART), the Technique for Human Error Rate Prediction (THERP), and the Empirical Technique to Estimate Operator Errors (TESEO). A discussion of these methods can be found in Reference 6.
HEART, as an example, is deterministic and fairly straightforward. It was developed by J.C. Williams in the early 1980s. HEART computes a human error probability from a nominal error probability for the task, multipliers for the error-producing conditions present, and a proportioning factor reflecting how much of each condition’s effect applies. The first two are provided in tables, while the proportioning factor is determined by the experience of the person doing the analysis.
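As a rough illustration of the arithmetic, here is a minimal sketch of a HEART calculation. The function follows the commonly published HEART formula, but the nominal probability and multipliers in the example are placeholders, not values from the published HEART tables:

```python
def heart_hep(nominal_hep, epcs):
    """Estimate a human error probability (HEP) with the HEART method.

    nominal_hep: generic error probability for the task type
    epcs: list of (multiplier, proportion) pairs, where multiplier is an
          error-producing condition (EPC) factor and proportion (0..1) is
          the analyst's judgment of how much of that condition applies.
    """
    hep = nominal_hep
    for multiplier, proportion in epcs:
        # Each EPC scales the probability by ((multiplier - 1) * proportion) + 1.
        hep *= (multiplier - 1.0) * proportion + 1.0
    return min(hep, 1.0)  # a probability cannot exceed 1

# Hypothetical example: a routine task (nominal HEP 0.003) performed under
# time pressure (EPC x11, judged 40% applicable) by an inexperienced
# operator (EPC x3, judged 60% applicable).
print(heart_hep(0.003, [(11, 0.4), (3, 0.6)]))  # 0.003 * 5.0 * 2.2 = 0.033
```

Note how the analyst’s proportioning judgment shapes the result; two analysts with different experience can arrive at noticeably different probabilities for the same task.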
There have been reports that the introduction of automatic protections has actually increased the rate of human error. One explanation is that, knowing automatic protections are in place, operators may take more risks, either individually or in their operating philosophy. If true, this merits close evaluation of the human factors involved.
In conclusion, human error occurs in instrument systems all the time, and human factors play a large part in facilitating it. The cost of human error can be high, and the impact on safety can be substantial.
It cannot be assumed that normal management or supervisory systems will reduce or minimize human errors. Indeed, they may create human factors that actually facilitate human error, and they may not have any formal methods to reduce errors.