
Systemness thinking allows for safer designs

May 2, 2022
aeSolutions demonstrates how people can close safety gaps and practice safety methods

This article is part of a series on process safety. Check out the rest of the series here.

While most quests for process safety focus on removing humans and their perceived errors from operations, recent research indicates that people can contribute greatly to improving safety if the processes, systems and environments they work within are better designed.

"There are many new and digitalized technologies, as well as artificial intelligence (AI) tools, going into chemical and other plants. But, even with all these new technologies, these organizations and their facilities can't get away from the impact of people and the human element," says Dave Grattan, process safety engineer at aeSolutions, a consulting and engineering firm in Greenville, S.C., and a member of the Control System Integrators Association (CSIA). "The new 'safety differently' idea flips this script: it views humans as a solution for closing safety gaps created by hardware, operations and automation. This systemness thinking argues that people have to work within certain system constraints, and take actions that make sense at the time, even if they may not make sense to outside observers. The new book, There Are No Accidents by Jessie Singer, reports that people achieve results that are already baked into the systems where they work. The main idea of systemness is to go deeper than the usual human-error blame game. Sometimes different people will make the same errors in the same system, and that's when its design, interfaces and normal work variations need to be addressed, and the context in which people act needs to be examined and improved."

Grattan explains that aeSolutions is trying to incorporate systemness thinking in its projects, along with the human-factors task analyses and human reliability analyses it already employs, which are used to resolve safety issues before they cause incidents. For instance, it applies incident investigation tools proactively in railcar unloading applications to avoid potential loss of containment and hazardous releases.

"We study the site and analyze staff tasks, identify error-promoting or error-likely situations, and design and install solutions before accidents can occur," explains Grattan. "These efforts include automating operations, and evaluating alarms and procedures. We also look at operators' risk perceptions, variations in their experience levels, and operating environment conditions. For example, if they're running digital procedures on tablet PCs, we also check the network's robustness, and ask what they plan to fall back on if needed. These studies can find many opportunities that are easy to address, such as performance weaknesses during one plant's turnarounds, which were due to less-experienced quality inspectors finding fewer quality issues during their night shifts. In another unloading application at a very large site, we found that trucks were getting lost and unloading manually, which risked inadvertent mixing and hazardous reactions in tanks. It was suggested that locks and tags be added, but that can create constraint issues. A better solution was to reduce the chance that a truck would get lost, and provide certified unloaders at all units."

Grattan reports that most end users and other plant-floor personnel want to run their equipment, processes and facilities properly, but some operations can still get out of hand. To prevent these lapses and potential incidents, he adds, there are several methods that can help them operate more safely. They include:

  • Petro-HRA method for qualitative and quantitative assessment of human reliability in the petroleum industry. It allows systematic identification, modeling and assessment of tasks that affect major accident risks.
  • Action Error Analysis (AEA) analyzes interactions between humans and machines, usually in petrochemical applications, but also in flight operations, air traffic management (ATM) and aviation safety. It's used to study the consequences of potential human errors in task execution related to directing automated functions. AEA is similar to failure mode effects analysis (FMEA), but is applied to steps in human procedures, rather than components or parts.
  • Systems Theoretic Accident Model and Processes (STAMP) is founded on basic systems theory concepts, and provides a theoretical foundation for introducing new types of accident analysis, hazard analysis and accident-prevention strategies including new approaches to designing for safety, risk assessment techniques, and approaches to designing performance monitoring and safety metrics.
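AEA is applied qualitatively in practice, but the FMEA analogy described above can be sketched in code. The sketch below is illustrative only: the procedure steps, error modes and 1-10 ratings are hypothetical (loosely echoing the truck-unloading example earlier in the article), and the risk priority number (severity × likelihood × detectability) is borrowed from FMEA practice rather than prescribed by any AEA standard.

```python
from dataclasses import dataclass

@dataclass
class StepError:
    """One potential human error in a procedure step (an AEA-style row)."""
    step: str           # the procedure step being analyzed
    error_mode: str     # how the operator could deviate from the step
    severity: int       # 1-10: consequence if the error occurs
    likelihood: int     # 1-10: chance the error occurs
    detectability: int  # 1-10: 10 = hard to detect before harm

def risk_priority(e: StepError) -> int:
    # FMEA-style risk priority number, applied to a human task step
    # instead of a hardware component.
    return e.severity * e.likelihood * e.detectability

def rank_steps(errors: list[StepError]) -> list[StepError]:
    # Highest-risk steps first, so mitigations target the worst gaps.
    return sorted(errors, key=risk_priority, reverse=True)

# Hypothetical unloading-procedure rows for illustration.
errors = [
    StepError("Connect unloading hose", "connect to wrong tank", 9, 3, 6),
    StepError("Open discharge valve", "open before hose is secured", 8, 2, 4),
    StepError("Verify tank level", "skip the level check", 6, 4, 7),
]

for e in rank_steps(errors):
    print(f"RPN {risk_priority(e):3d}  {e.step}: {e.error_mode}")
```

Ranking the rows this way surfaces where design changes (automation, interlocks, certified unloaders) would pay off most, which mirrors how the article describes applying investigation tools proactively rather than after an incident.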

"STAMP is an accident-causality model, so users can feed data in, get better insights on causality, and predict weaknesses more effectively," says Grattan. "All of the methods are easier to employ than previous tools, so it's more likely they'll get used and prevent more problems and accidents."

About the Author

Jim Montague | Executive Editor

Jim Montague is executive editor of Control. 
