We all try to understand why it’s not working and how to fix it. The “it” may be a process that was working, but is no longer meeting expectations on quality, productivity or energy efficiency. The reason may be erosion in a pump impeller, fouling in a heat exchanger, deactivation of a catalyst, or the like. And, if you can find the cause, you can fix it. The symptoms are the outcomes that don’t meet expectations. The cause is the reason it is not meeting expectations. To diagnose the cause, you need expectations of what “it” is supposed to be like, and an understanding of how influences affect the outcome.
The “it” may be any number of things in our daily life: A bug in a computer program that causes occasional failure. An issue that is bothering a control system. A design of a new process. A strain in a formerly pleasant relationship. The reality is the application of a troubleshooting procedure is ubiquitous and frequent, and success with it means success in our aspirations.
Often, teams of folks brainstorm about possible causes, then seek to explore or change each possible cause. This is called a shotgun approach. It is inefficient, and, in implementing many unnecessary changes, the shotgun approach actually makes many small unnecessary upsets and increases variability. W. Edwards Deming’s concept for statistical process control is to prevent such tampering and improve uniformity. Further, without a rational structure to a diagnosis process, the outcomes of many troubleshooting sessions often are appeasements to the boss’s opinion, resurrections of past actions, or appropriations of the situation for personal political reasons.
A medical doctor uses the patient’s symptoms to diagnose the disease, then provides a protocol to cure that particular cause. Usually, there are many diseases that could lead to the patient-identified symptoms, and the doctor follows a protocol to ask about other possible symptoms or to run a few tests. Then it is possible to discriminate among possible causes and identify the single most probable one. The doctor does not prescribe 10 different treatments for each of 10 hypothesized causes, which would be the shotgun approach.
Analytical troubleshooting, rational diagnosis, rational decision-making, problem solving and critical thinking are all terms given to structured methodologies to diagnose the cause of an undesired outcome. With so many names, it must be important. I think it is, but I never learned it in school. My company provided an internal training course based on the Kepner and Tregoe book, “The Rational Manager: A Systematic Approach to Problem Solving and Decision-Making,” to reveal structured diagnosis techniques. For instance, answers to questions such as “where it is and where it is not” and “when it is and when it is not,” or “if that is the cause, what else would it express,” or an interval-halving search, can help direct the generation of hypothesized causes and provide the data to reject wholly inconsistent hypothesized causes. It was one of the most empowering courses of my life.
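The interval-halving search mentioned above can be sketched in a few lines of code. The idea: if a process was working at some point and is faulty now, and the fault persists once introduced, testing the midpoint of the suspect interval eliminates half of the candidates each time. This is a minimal illustrative sketch, not from any of the cited books; the batch numbers and the `is_faulty` check are hypothetical stand-ins for whatever test discriminates "good" from "bad" in a real diagnosis.

```python
def interval_halving_search(is_faulty, lo, hi):
    """Locate the first point in [lo, hi] where is_faulty(i) is True.

    Assumes a monotone history: every point before the fault tests
    good, every point at or after it tests faulty. Each probe halves
    the remaining interval, so the search takes O(log n) tests instead
    of checking every candidate (the shotgun approach).
    """
    while lo < hi:
        mid = (lo + hi) // 2
        if is_faulty(mid):
            hi = mid        # fault appeared at or before mid
        else:
            lo = mid + 1    # fault appeared after mid
    return lo


# Hypothetical example: product quality went off-spec somewhere in
# batches 0..99; suppose (unknown to the troubleshooter) the cause
# entered at batch 37.
first_bad = interval_halving_search(lambda batch: batch >= 37, 0, 99)
print(first_bad)  # 37, found in about 7 tests rather than up to 100
```

The same halving logic underlies tools such as `git bisect`; the payoff is that each test is chosen to reject half of the remaining hypothesized causes, rather than chasing them one at a time.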
While I can recommend the K&T book, there is a wide range of offerings that may also provide effective troubleshooting guides. Two that also seem recommendable are “Root Cause Analysis: The Core of Problem Solving and Corrective Action” by Duke Okes, and “The Rational Project Manager: A Thinking Team's Guide to Getting Work Done” by Andrew Longman and Jim Mullins. I am sure that there are many other favorites.
Systematic diagnosis and problem identification are essential skills for both life and engineering. There is problem solving at the graduate research level, but instead of following a formalized process, it is led intuitively and haphazardly. At the undergraduate level, we teach the right way, and as soon as students almost get it, we move them to the next topic. Rarely is there any training in systematic diagnosis and decision-making within engineering course topics. As a result, faculty who were never trained in rational troubleshooting cannot relay it to their students, and they propagate this void in a critical engineering skill. You, therefore, need to pick it up on the job.
My industrial course about a rational structure for effective situation analysis and good decision-making was an eye-opener for me. And, once understood by the professionals in the location, it became an important and common approach that enabled teams to better target corrective action. Similarly, HAZOP, HAZAN, SIS and ISO-9000 procedures are widely accepted, useful and beneficial, and provide collective acceptance.
However, by imposing a structure, we often disable creativity and intuitive leaps. In his book, “Streetlights and Shadows – Searching for the Keys to Adaptive Decision Making,” Gary Klein provides much advice on how to temper the restraints of such structured thinking. He reveals how protocol can restrict a mental process to tunnel vision and actually reduce expertise as it shifts focus from the item to the structured procedure.
I am a proponent of rational thinking processes and the use of guides to design and diagnosis, but also know that structure can lead to trivialization and misdirection. Klein raises the case for experience and tacit knowledge to complement structures. His book does not discount the use of rational guides. Instead, it argues for the need to infuse the process with tacit knowledge, expertise and perception. He provides guidance to break free of mental models (fixation), to unlearn and relearn how it works. I think it is a good book about understanding thinking, problem solving and design, and a useful complement to the books about structured troubleshooting. I appreciate Stephen Mayo for introducing it to me.