A scroll through these posts (the ones that mention Leveson) will give an overview. The Deepwater Horizon incident is discussed there as well.
A draft copy of Nancy Leveson’s book is available free on-line:
https://static1.squarespace.com/static/53b78765e4b0949940758017/t/57d87eb6d2b8571af3501b26/1473898764674/Engineering_a_Safer_World+Nancy+Leveson.pdf

Traditional risk and failure analysis focuses on specific pathways that lead to accidents, identifying potential points of failure and the singular “causes” of the accident (most commonly including operator error). Leveson believes that this approach is no longer helpful. Instead she argues for what she calls a “new accident model” – a better and more comprehensive way of analyzing the possibilities of accident scenarios and the causes of actual accidents.
The longstanding and widespread tradition of the person approach focuses on the unsafe acts—errors and procedural violations—of people at the sharp end: nurses, physicians, surgeons, anaesthetists, pharmacists, and the like. It views these unsafe acts as arising primarily from aberrant mental processes such as forgetfulness, inattention, poor motivation, carelessness, negligence, and recklessness. Naturally enough, the associated countermeasures are directed mainly at reducing unwanted variability in human behaviour. These methods include poster campaigns that appeal to people’s sense of fear, writing another procedure (or adding to existing ones), disciplinary measures, threat of litigation, retraining, naming, blaming, and shaming. Followers of this approach tend to treat errors as moral issues, assuming that bad things happen to bad people—what psychologists have called the just world hypothesis.
The link above is book-length; one way to skim it quickly is to Ctrl-F through it using the terms in this diagram.

The captain as an independent actor vs. as part of a system.
IMHO both approaches have merit; using just one misses things.
Maybe, but isn’t it obvious what the captain did wrong? It seems like it’s obvious to mariners and non-mariners alike.
Just focusing on what the captain did wrong is worse than useless without understanding hindsight bias.
VERY obvious, but he worked in a system that made him think it wasn’t wrong. He may never have been on a boat with a routine anchor watch in his entire life. As has been mentioned, some dive boats have passengers aboard without any crew at all, not even sound-asleep crew.
A bit about hindsight bias.

All these points are valid, but they ignore the fact that personal accountability plays a role in decision making.
In real life, good, conscientious mariners who are motivated to do a good job almost never follow the procedures, guidelines, etc. exactly as written.
Over time, as workers get more experienced at their jobs, they use their own judgement as to which written requirements and procedures need to be followed exactly, when shortcuts can be taken, and which steps can be skipped.
Case in point: I’m the guy writing the procedures at work, and I still don’t follow them to the letter. Sure, I’m supposed to update the written guidelines to converge with actual practice, but this is the natural way of things. I do have a zero-tolerance policy on other people gundecking; if gundecking becomes necessary, I’ll do it myself, and then hopefully get forced to make the necessary changes.
I recall when the company first sent out the SMS binders to the ship: two 4-inch-thick binders, some sensible stuff, some CYA nonsense.
Next trip I had a young ex-fisherman as third mate. He’d taken a look and couldn’t make sense of it, so he asked me what it was all about.
I told him the books were in two parts: if he failed to follow the good half of the instructions he’d be fired, and if he did follow the bad half he’d be fired.
So the third mate says, “OK, so which half is which?” I told him: that’s the thing, nobody knows.