Thanks to Twitter, again, I recently came across one of the most insightful articles I’ve read in a long time—on a safety blog, no less. Steven Shorrock takes on the concept of “human error” and adds considerably more perspective to this oft-used term.
“In the aftermath of the [Spain rail] accident, initial investigations ruled out mechanical or technical failure, sabotage and terrorism,” Shorrock writes. “That appeared to leave only two possible explanations—‘human error’ or ‘recklessness,’ or both. When society demands someone to blame, the difference—whatever it might be—can seem trivial. What followed was a display of our instinct to find a simple explanation and someone to blame.”
When something bad happens—an accident like this, an inappropriate comment, a disastrous rollout of a website—we seek action, fast. We want accountability, and usually by that we mean rolling heads. Rarely do we probe further into the context of the event, and especially into the system. As just one example:
…
Comments
Open Systems
Hello Kevin:
Terrific article. I hope readers see that it is possible to inappropriately apply the concept of blaming human error to any "error" in a system, safety or otherwise.
Will you please explain what the following means? "Taking one example, without expectation, radio-telephony would be very inefficient."
Since all systems are inevitably open, variability always exists. We can create momentarily closed systems, but I am not aware of any system that is always or permanently closed. Variation is inevitable.
Finally, I think I heard the following quote from Russell Ackoff: "We see what we thought before we looked." This is another way of saying that we hear what we expect to hear.
Thank you, Dirk
Myron Tribus
It was Myron Tribus who said we see what we thought before we looked, not Russ Ackoff.
An old story
It's an old story that blaming comes before understanding, to the point that non-teleological approaches are mostly unknown. And if we look at many, many FMEA records, we find that "human error" takes the lion's share of failure causes. It remains to be understood, however, whose error the "human error" is: the operator's, or that of those above him who selected him, trained him, and control him?