Situation Critical

Bad Human/System Relationships Can Quite Literally Blow Up in Everyone's Face

In the aircraft industry there are some sound definitions of "situation awareness" and some sound engineering principles that have resolved many of that industry's human error issues. We have also learned from aviation's experience that trying to automate a way out of this problem is fraught with new problems, such as increased system complexity, loss of situation awareness, system brittleness and workload increases at inopportune times. The attempt to automate away so-called human error has only led to more complexity, more cognitive load and catastrophic errors associated with losses of situation awareness (Endsley & Kiris, 1995).

To address this issue, it is critical to first understand the full problem. The industry must examine why operators have failed in the past, look critically at the systems those operators were given, and try to understand why those systems were not successful. This is extremely difficult because it clashes with many cultural issues and challenges many of the working practices that have evolved with the industry. What many engineers today hold onto as good practice may be revealed as poor or bad practice.

It was a revelation to the author, a traditional engineer who came up through an electrical engineering background, learned instrumentation, and evolved into a control engineer after working with the early computers that became what we know as DCS systems today, that this fundamental system design, and its growth through evolution, was flawed.

In her book Designing for Situation Awareness: An Approach to User-Centered Design, Mica R. Endsley describes technology-centered design: traditional sensors and systems are assembled to perform functions, and a display is then added for each system to tell the operator how well that particular system is operating or what its present status is. As the design evolved, systems kept being added until the operator displays grew exponentially. The operator was expected to find, sort, integrate and process the vast array of available information, leading inevitably to an information gap. It was never even considered that the human has limitations, or that the human could become the bottleneck. Because the display of data in these systems is centered on the technologies producing it, the data is often scattered and poorly suited to supporting human tasks.

An alternative was never considered, although engineers knew one from the early days of instrumentation: panels with instruments mounted on them, arranged around the operator's tasks so the operator would not have to run up and down the panel every time he or she performed a task. This was a better solution, though it had limitations. The panel only had so much room for equipment, and changing it or adding new equipment once the design was complete was difficult (a kind of built-in management of change, or MOC).

However, user-centered design goes much further than this basic concept of task grouping. It considers displaying information in ways that fit the goals, tasks and needs of the user, and it strives for optimal functioning of the overall human-machine system rather than centering information on the sensors and technologies that produce it.
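To make the contrast concrete, here is a minimal sketch in Python of the two organizing principles. All of the tag names, system names and task names are hypothetical, invented purely for illustration; the point is where each piece of data ends up, not any particular DCS product.

    # Technology-centered: one display per system or sensor package.
    # The operator must visit several displays and integrate mentally.
    technology_centered_displays = {
        "compressor_skid":   ["PI-101", "TI-102", "SI-103"],
        "lube_oil_system":   ["PI-201", "TI-202", "LI-203"],
        "vibration_package": ["VI-301", "VI-302"],
    }

    # User-centered: one display per operator goal or task, gathering
    # whatever data that task needs, wherever it originates.
    user_centered_displays = {
        "start_compressor":       ["PI-101", "PI-201", "LI-203", "SI-103"],
        "monitor_machine_health": ["VI-301", "VI-302", "TI-102", "TI-202"],
    }

The data is identical in both cases. Only the organizing principle changes, and with it the amount of searching and mental integration left to the operator.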

One of the first barriers Endsley identifies is understanding what user-centered design is not. This is going to be hard for many engineers. Over years of developing HMIs and graphics, they learned that operators often had more insight into what made a good graphic than they did, so they either left the design entirely to the operators or sought operator input on what the graphic should look like. That sounds very reasonable, but it has been found to be fraught with pitfalls and could be considered a poor practice.

Endsley brings out some important points that should be considered. The first is that operators often have only partial ideas about what might be better than what they are used to. They generally have very limited knowledge of how to present information effectively or how to design human interaction with complex systems.

The next is that these issues are compounded by the fact that most systems must be used by many different individuals, each of whom may have significantly different ideas about what they would like to see in a new design. The result of this approach is an endless and costly cycle of implementing new ideas, only to have the next team of operators decide they want something different. Design solutions tend to be sporadic and inconsistent across features of the interface, and many design problems go unrecognized.

The best operator on the unit is often the one seen designing graphics, and although much good thought goes into the design, inconsistencies appear: poor use of color, colors not reserved for coding or used for multiple codes, far more codes than most operators can memorize, and poor, cramped layouts that try to fit everything of possible value onto the display.
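As a concrete example of the discipline the human-factors literature points toward, here is a minimal sketch in Python. The palette and alarm levels are hypothetical, merely in the spirit of published guidance on reserving color for coding; they are not a scheme taken from Endsley's book or from any standard.

    from typing import Optional

    # Hypothetical sketch: each color is reserved for exactly one meaning,
    # and the set is small enough for an operator to memorize.
    ALARM_COLORS = {
        "critical": "red",      # red means critical alarm and nothing else
        "warning":  "yellow",   # yellow means warning and nothing else
        "advisory": "magenta",  # magenta means advisory and nothing else
    }
    NEUTRAL = "gray"            # all non-alarm elements stay muted

    def element_color(alarm_level: Optional[str]) -> str:
        # Color codes alarm state and only alarm state; normal or
        # unrecognized states fall back to the muted neutral.
        return ALARM_COLORS.get(alarm_level, NEUTRAL)

The design choice being illustrated is that a color can only ever mean one thing, and the whole code set stays small enough to recall under stress.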

Endsley is not saying that operators have no valuable input. In fact, she states that operators are a valuable source of input, providing information about problems experienced, information and decision needs, working conditions, and needed functions. But implementing whatever they want, unfiltered, completely neglects the large base of scientific literature on which types of interfaces work and which do not.
