Think what you will of Donald Rumsfeld, former U.S. secretary of defense under George W. Bush, but his comment in 2002 (about weapons of mass destruction in Iraq) remains one of the most memorable of modern times: "Reports that say that something hasn't happened are always interesting to me because, as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say, we know there are some things we do not know. But there are also unknown unknowns—the ones we don't know we don't know."
Rumsfeld didn't coin the phrase—it was used by NASA and others in the 1970s and 1980s to talk about assessing the risks in aerospace and uranium mining—and it serves to remind us not only that we don't know everything, but that how we perceive our ignorance is colored by our expectations, biases and prejudices.
Recent focus here in Chicago on police car dash camera footage showing extreme use of force by officers on apparently non-threatening offenders has raised a lot of questions about how that crime prevention and control workforce goes about its business. But it also shows the power of information technology—the camera footage—to show us things we didn't know or might not have been willing to admit were going on. And over the next few months, Chicago will find out how well it's able to use that information to improve the quality and efficiency of the policing process.
Similarly, applying information technology in a connected world is explosively increasing our ability to collect and analyze data. But will we be willing and able to use it effectively? I can't remember how many times I've analyzed a process or quality problem, even using design of experiments methods, and ended up chasing my tail because one set of theories or another appeared to be supported when, in fact, the results were confounded by factors I hadn't thought of, wasn't aware of, or was kept in the dark about.
I might run one batch on day shift and the other on nights, assuming both crews (and the humidity and the traffic on the railroad tracks just outside the plant, etc.) were not factors I needed to control. At night, even the velocity of natural gas through the cruddy mains is higher—could we have picked up a contaminant? What if another furnace on another job in an adjacent building was running—or not? How fast was the flow of the argon quench to the vacuum furnace? How cool was the cooling water to the heat exchanger?
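That kind of confounding is easy to demonstrate. The following is a minimal sketch with invented numbers (the yields, the size of the contaminant effect and the batch sizes are all hypothetical): two recipes with identical true performance are run on different shifts, and a lurking night-shift factor shows up looking like a recipe effect.

```python
import random

random.seed(42)

# Hypothetical process: yield depends on a hidden shift factor,
# not on the recipe. All numbers here are invented for illustration.
def run_batch(recipe_effect, shift, n=30):
    night_contaminant = -3.0 if shift == "night" else 0.0  # lurking variable
    return [70.0 + recipe_effect + night_contaminant + random.gauss(0, 1)
            for _ in range(n)]

# Both recipes have a true effect of zero...
batch_a = run_batch(0.0, "day")    # recipe A, run on day shift
batch_b = run_batch(0.0, "night")  # recipe B, run on night shift

mean_a = sum(batch_a) / len(batch_a)
mean_b = sum(batch_b) / len(batch_b)

# ...yet recipe A appears to "win," because shift is confounded with recipe.
print(f"recipe A mean yield: {mean_a:.1f}")
print(f"recipe B mean yield: {mean_b:.1f}")
print(f"apparent 'recipe effect': {mean_a - mean_b:.1f}")
```

The standard remedy—randomizing or blocking the run order so each recipe appears on both shifts—is exactly what the uncontrolled factors above defeat when you don't know they exist.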
The information promised by the Industrial Internet of Things (IIoT), especially combined with regular IoT data from the outside world, suggests the potential for great strides in productivity, quality, energy efficiency, reliability—all the levers we can use to transform plants, reduce drudgery and make processed goods even better, more plentiful and inexpensive. Sophisticated software and machine intelligence will be able to show us every possible opportunity, if we are willing and able to look.
But experience suggests that few of us will look beyond our prejudices—our preconceived notions of what matters, what doesn't and what we can do about it. If the data suggests our people—our engineers, operators and maintenance technicians—need to do something different, will we accept that responsibility? Will we make human changes by example, education and reward? Or will we try to punish, or simply discard, the folks who no longer fit into the model?
It seems so obvious and simple to understand that knowledge is gained from information, information relies on facts, and facts are discovered by analyzing data. But to gain the highest benefits of that knowledge, we have to open our minds to new realities and ideas that may contradict what we have always held as absolute truths.