This article is from August 2019, shortly after the NTSB report on the McCain collision came out. Who knows why the great algorithm in the ether brought it up to me now, but it did seem to jostle loose some subterranean thoughts on automation.
So is the navy throwing out the baby with the bath water, or making a meaningful improvement? It’s hard to tell from reporting like this. But the interesting part for me relates to HMIs.
Easy enough to agree with this:
“This is a classic case of information overload and a poorly thought out Human Machine Interface,”
I’m assuming that by “mechanical controls” they are referring to the I/O devices only, and I hope so. Automation people can run away with themselves if you let them. Assuming “young sailors” will naturally find touch-screen control better or more efficient is not warranted in every context. It may not be the right choice no matter who you are.
Making critical manual inputs to an automation system via wheels, levers, and big knobby things is intuitive no matter what your age is, and (speaking from an ECR perspective) achieves the end result with a minimum of distraction from the OODA loop while executing an action. Example: silencing an alarm.
1. Noise.
2. Glance at the top row of a screen to see what the current alarm is.
3. Find the pointing device (if not a touch screen).
4. Move the cursor to the silence button on the screen.
5. Click.
6. Navigate to the alarm page, or look at a display you may have set up to be left on the alarm page.
7. Check whether the alarm was part of a sequence, and ORIENT / DECIDE.
An improvement?
Add manual silence buttons to the console; we had 5 or 6 over the length of the console. It is easier to silence the audible notification while maintaining focus on the alarm information, the plant information, and the next steps.
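From the software side, the hard button costs almost nothing: it can be wired to call the exact same silence action as the on-screen button. A minimal sketch of that idea, with entirely hypothetical names (AlarmHandler, SILENCE_BTN inputs, the example alarm tag) since no real system is described here:

```python
# Hypothetical sketch: hard-wired console silence buttons mapped to the
# same software action as the on-screen silence button.

class AlarmHandler:
    def __init__(self):
        self.horn = False           # audible notification
        self.unacknowledged = []    # alarms still needing attention

    def raise_alarm(self, tag):
        self.unacknowledged.append(tag)
        self.horn = True

    def silence(self):
        # Stops the audible notification only; the alarm itself stays
        # unacknowledged so the operator can still ORIENT and DECIDE.
        self.horn = False

def scan_digital_inputs(handler, di_states):
    # One scan of the hard-wired console buttons; several silence buttons
    # along the console all map to the same single software action.
    for di, pressed in di_states.items():
        if pressed and di.startswith("SILENCE_BTN"):
            handler.silence()

h = AlarmHandler()
h.raise_alarm("ME LO PRESS LOW")            # illustrative alarm tag
scan_digital_inputs(h, {"SILENCE_BTN_3": True})
print(h.horn)            # False: horn off in one press, no cursor work
print(h.unacknowledged)  # ['ME LO PRESS LOW']: alarm still to be reviewed
```

The design point is that the button removes the find-pointer, move-cursor, click detour, not that it changes the alarm logic in any way.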
Power management is great, and starting a DG set for a routine DG changeover is not too burdensome: navigate to a screen, click a few buttons, respond to pop-ups, and so on. But what about when DP calls and says we need one now? We put a big switch on the console, “start next engine”, within reach of the phone.
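The “start next engine” switch can be sketched the same way: one entry point that both the screens and the hard switch call, with no pages or pop-ups between the request and the start. This is a toy under assumed names (PowerManagement, the DG states); real PMS priority logic such as load-dependent start tables and duty rotation is deliberately omitted:

```python
# Hypothetical sketch: a single "start next engine" action shared by the
# power-management screens and a hard-wired console switch.

class PowerManagement:
    def __init__(self, gensets):
        self.gensets = gensets  # e.g. {"DG1": "running", "DG2": "standby"}

    def next_standby(self):
        # Pick the first genset in standby; real duty-rotation and
        # load-dependent start logic would live here.
        for name, state in self.gensets.items():
            if state == "standby":
                return name
        return None

    def start_next_engine(self):
        # One call whether it comes from the HMI screens or the big
        # console switch within reach of the phone.
        dg = self.next_standby()
        if dg:
            self.gensets[dg] = "starting"
        return dg

pms = PowerManagement({"DG1": "running", "DG2": "standby", "DG3": "standby"})
# DP calls and says "we need one now": one throw of the switch.
print(pms.start_next_engine())   # DG2
print(pms.gensets["DG2"])        # starting
```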
These are trivial things for automation to handle, but developers can get too enamored with their own toolbox and feature set. It seems to me the first version out to the crew is usually deficient in these simple things.
If your console is only a display, or even multiple displays, I think you are making it harder to operate. Certain key parameters should still be meters in a fixed position on the console, and simple indication lights, which make it easier to OBSERVE and ORIENT, are too easily left out of the design.
I wonder if there are studies showing that dropping your head down and limiting your field of focus and attention to a few square inches of screen at a time, perhaps involving a sequence of eye movements and physical actions (cursor navigation, button clicking), actually slows down the problem-solving sequence of thoughts.
I’ve come to appreciate what I will call “reliable remote control”, and even in automated, unattended plants I never design those options out of the software.
I hope the navy asks a few old plant operators to contribute to the next generation of HMI it has in mind. Ideally the functional description of the system would go beyond “must be able to make load-dependent starts of DGs” and include more on the HMI side.