How Complex Systems Fail - Richard Cook, MD -

How Complex Systems Fail (pdf)

(Being a Short Treatise on the Nature of Failure; How Failure is Evaluated; How Failure is
Attributed to Proximate Cause; and the Resulting New Understanding of Patient Safety)
Richard I. Cook, MD

Very Good paper - 18 short points


Indeed, a good paper. Some aspects of points 7 and 11 were relevant in several unfortunate incidents I’ve been involved with.

Point 7 - ascribing a single root cause is an impossibility in some incidents. There are too many variables: several, and in some instances a dozen or more, human elements over a protracted period of time often contribute to an incident.

Point 11 - speaks to the control of costs involved in work and getting work done. After an incident occurs in the marine world, the practitioner (the mariner in the hot seat after an incident) either truly made a mistake, or took a risk doing something with unintended consequences, or was at the receiving end of pressure from someone to get something accomplished, also with unintended consequences. In any case, it’s the practitioner who’s usually held responsible for the incident.

I imagine every person with seagoing experience and reading this thread has witnessed the result of points 7 and 11 especially, and perhaps one or more of the other 16 points itemized in Dr. Cook’s paper.


Yes, with point 11 I’ve been calling this a “rule-based” analysis. One example was the El Faro and the so-called 1-2-3 rule for hurricane avoidance. A lot of commenters pointed out that this rule was violated, as if that were the final word. But in the real world almost everyone working an area under threat violates this rule, so it’s not meaningful to point it out.

Same thing with the COLREGS. Almost nobody is 100% in compliance 100% of the time. In practice the COLREGS contain far more ambiguity than a post-incident analysis admits, ambiguity which hindsight bias strips away after the fact.

11 Actions at the sharp end resolve all ambiguity.

Organizations are ambiguous, often intentionally, about the relationship between
production targets, efficient use of resources, economy and costs of operations, and
acceptable risks of low and high consequence accidents. All ambiguity is resolved by
actions of practitioners at the sharp end of the system. After an accident, practitioner
actions may be regarded as ‘errors’ or ‘violations’ but these evaluations are heavily
biased by hindsight and ignore the other driving forces, especially production pressure.

I should reread this article before I comment on any incident in the future.
