By Walt Boyes, Editor in Chief
We have been trying very hard to make workplace safety, process safety and cybersecurity actually work in our plants. Well, at least some of us have been trying very hard. Others, not so much. In a recent study of safety professionals conducted by Kimberly-Clark, 89% said they had observed workers not wearing their protective gear when they should have been, and 29% said they'd seen it happen frequently.
The record for process safety is equally dismal. From the Deepwater Horizon disaster to the multiple incidents at Formosa Plastics and other refining and chemical plants throughout the world, from this vantage point, it sure looks like we just don't get it.
The yawn from industry CEOs at the cybersecurity dangers of follow-on Stuxnet attacks, the repeated disasters, such as the nearly hour it took the ExxonMobil RTO (Real-Time Operations Center) in Houston to shut the isolation valves after the Yellowstone pipeline rupture, and the SCADA-related findings released at Black Hat all make me wonder why we're fiddling while the infrastructure catches fire.
There are several reasons. Some are, obviously, financial. But much scarier is the other reason: the way human beings react to threats.
The financial reasons to ignore safety and security issues are pretty clear. The costs of making plants safer and more secure are substantial, and they all negatively impact the profitability of the enterprise. There are also risk-management issues, but there is no clear and consistent effort by corporate risk managers and their insurance auditors to make mitigating safety and security threats a high priority. There's no way to include "protected the plant from security and safety threats" on the financial roll-up.
The problem with this widespread attitude, however, is that even in the face of logic and data, both management and employees try to skate on safety and security issues all the time. Why?
We humans appear to engage regularly in a sort of magical thinking about danger and threats. This kind of magical thinking was clearly in the driver's seat on the day the Deepwater Horizon sank. The two top executives from BP's Gulf of Mexico drilling and exploration division were actually on board when the rig began to explode. They were there to present an award to the employees for being the safest rig in the fleet: seven years without a lost-time accident. Before the disaster, that record was used overtly not only to excuse, but actually to permit, violations of safety protocols that were directly responsible both for the accident and for the failure to respond to it.
If the danger is real and imminent, and the threat is life-or-death, people react with stunning capability. But the more distant the threat is perceived to be, the more unreal it becomes. We discount the seriousness of any threat that doesn't affect us personally and immediately, and we are clearly doing that for both safety and security issues. We perceive the threat to be less serious than it actually is. It's a perception problem.
It could be that we are hardwired to think this way as a reaction to the fact that the world is and always has been a fairly scary and dangerous place.
But if that's true, we may have come to the limit of what we can expect corporations and government to do about workplace and process safety and cybersecurity. It may actually take big incidents like Deepwater Horizon, BP Texas City and the attack on the Iranian uranium enrichment facility to force us to treat these threats as imminent danger and do something about them.
That may be the scariest thought of all.