Augmented Reality (AR) technology

MOL to install augmented reality (AR) technology on 21 of their VLCCs:

AR is already available in Singapore, where MOL has its operational HQ:

I personally think there is a lot of changing information on that screen and it is not easy to look at, although the concept of visual support for the OOW is good. Normally you are only interested in the ships that could present a danger within, say, the next 30 minutes; leave the rest out to avoid confusion and cerebral overload.

It would be an idea to add the possibility of providing advice about a new safe course or waypoint, as well as something like decrease or increase speed to so many knots. That would be in line with the future autonomous way of sailing.

I watched the video and my head began to hurt from trying to take in all the information being displayed. I have to admit that it seems very cool, but I wonder about the necessity. The reason it was created for jet fighters is that the speed at which things develop makes it almost necessary. I don’t think the same is true of VLCCs.

There is a paper about that: Visualizing the Decision Space of a Ship’s Maneuverability in a Real-Time 3-D Nautical Chart

  1. SUMMARY In an information design project at Mälardalen University in Sweden, a computer-based 3-D chart system is being designed based on human-factors principles for more intuitive navigation at high speeds. This was earlier presented at the IST-036/RWS-005 Massive Military Data Fusion and Visualisation workshop in Halden, Norway, 2002. See figure 1. In this paper, a way of visualizing the decision space of a ship’s maneuverability a few minutes into the future is suggested. Known methods for calculating a vessel’s future position based on knowledge of present position, direction, speed and acceleration, as well as internal and external forces acting on the system, are used. Such a visualization tool will be best put to use on huge ships of great mass, such as, for example, large oil tankers.

Human cognitive shortcomings are evident not only when navigating at high speeds with limited decision times, but they may also become an issue with slow ships of great mass.
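
Not from the paper itself, but to make the prediction idea concrete, here is a minimal sketch (hypothetical function and parameter names, with acceleration and rate of turn assumed constant over the horizon) of projecting a ship’s track a few minutes ahead from present position, heading, speed, acceleration and rate of turn:

```python
import math

def predict_track(east_m, north_m, heading_deg, speed_mps,
                  accel_mps2, rot_deg_s, horizon_s, step_s=5.0):
    """Dead-reckon a ship's track a few minutes ahead.

    Uses a local flat-earth frame (metres east/north) and assumes the
    current acceleration and rate of turn stay constant over the horizon.
    Returns a list of (east, north) points forming the predicted track.
    """
    heading = math.radians(heading_deg)
    rot = math.radians(rot_deg_s)
    speed = speed_mps
    track = [(east_m, north_m)]
    for _ in range(int(horizon_s / step_s)):
        east_m += speed * math.sin(heading) * step_s
        north_m += speed * math.cos(heading) * step_s
        speed = max(0.0, speed + accel_mps2 * step_s)
        heading += rot * step_s
        track.append((east_m, north_m))
    return track

# Example: ~16 kn (8.2 m/s), slowing slightly, 0.1 deg/s starboard turn,
# predicted 3 minutes (180 s) ahead.
print(predict_track(0.0, 0.0, 45.0, 8.2, -0.005, 0.1, 180.0)[-1])
```

A real decision-space visualization would of course also need hull hydrodynamics, rudder and engine response and external forces (wind, current), which is what the abstract’s “internal and external forces acting on the system” refers to.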

Is this some sort of fixed installation that only works from one position, or will you have people walking around the bridge wearing VR goggles, or ???

Works from one position, I believe. One problem that this system attempts to solve is the need to walk around to collect the information required (radar, ECDIS, window, etc.).

Basically both systems (the OP and the one I linked to) use a bridge-eye view for the display rather than the traditional bird’s-eye view that ECDIS and radar use.

The one in the OP appears to superimpose nav info onto a window, and the one at the link uses what they are calling a “3-D Chart”.

I have the same issue but I think that is because the animation in the OP demonstration is sped way up. More like the speed of a fast ferry, including all the other traffic.

I’m an engine room guy & usually don’t post on navigation equipment & discussions, but a researcher friend of mine showed me this cool video about the future of bridge navigation. He says some Japanese crude carriers already have this technology installed. Looks pretty cool to me & makes sense. I suggested maybe they should let the window flash red when another vessel is too close, & perhaps have crewmembers wear transponders so you can see their names on the screen/window when they are on deck? I can imagine this technology getting a foothold in the industry eventually.
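
Purely as an illustration of the “flash the window red” suggestion, here is a hypothetical sketch (made-up target data structure and guard range) of how such a proximity alert could be triggered from decoded AIS target ranges:

```python
# Hypothetical sketch: scan decoded AIS targets and flag anything inside a
# chosen guard range (names and data structure are made up for illustration).
GUARD_RANGE_NM = 2.0

def targets_too_close(ais_targets, guard_range_nm=GUARD_RANGE_NM):
    """Return the names of targets whose range is at or inside the guard range."""
    return [t["name"] for t in ais_targets if t["range_nm"] <= guard_range_nm]

targets = [
    {"name": "ALPHA", "range_nm": 1.4},
    {"name": "BRAVO", "range_nm": 5.2},
]
close = targets_too_close(targets)
if close:
    print("ALERT: flash window overlay red for", ", ".join(close))
```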

If the link doesn’t work, look up “MOL Augmented Reality” on YouTube.

How are you supposed to process that constantly changing display?
It would do my head in very quickly.

He told me it incorporates the ECDIS & AIS. Hopefully it’s only installed on one window in each direction & has an on/off switch so it doesn’t interfere with normal navigation when turned off.

If it’s not goggles/glasses, it would no doubt be a free-standing installation (or installations) where you would stand or sit in a specific spot to see it.

The article in the OP says the video comes from the bridge camera, so I’d guess it’s a display like the ECDIS.

It would be quick to match up what’s seen out the window with the display, far quicker than matching the visual picture against the radar/AIS/ECDIS displays.

Well, if it’s just another, fancier screen to look at, then that is too bad. I feel a lot safer when I think the guys up top are actually looking out the windows at their surroundings, & I’m pretty sure those on nearby vessels feel the same. Not enough technology is seldom the root cause of accidents, while being distracted is right up there at the top of the list of likely causes. If the augmented reality display could somehow be inside a bridge window, that would be helpful IMO. If the ECDIS/depth sounder/AIS were in the line of sight of the horizon, it would eliminate some of the biggest reasons watch standers have to divert their attention from the surrounding traffic. But again, I’m just a wrench turner & don’t know too much about it.

I thought the problem wasn’t getting the information; it was the decision-making that follows?

Ulstein Bridge Vision, with a head-up display on the windows, was introduced as early as 2012 and was presented on gCaptain:

They raised this question:

Has it???

I’ve read that when landing an aircraft in low-ceiling conditions, the pilot flying is flying by instruments while the pilot not flying (PNF) watches out the window. As soon as the PNF can see the airport environment (runway, markings, lights, etc.) the pilots switch roles.

The issue is the cognitive load and time required to switch.

In the wheelhouse the view-switching problem is between the bird’s-eye view provided by traditional equipment (radar, ECDIS) and the bridge-eye view, which is what is seen out the windows.

In high-workload situations it is difficult to switch back and forth, which is why many mates bury their heads in the ARPA or ECDIS.

It’s also why BRM is about one person watching visually while a second person verbally provides additional info derived from instruments (contact speed or CPA).
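
For reference, the CPA and TCPA figures that the second person would typically report can be computed from a contact’s relative position and relative velocity. A minimal sketch, assuming a flat local frame and target data already available from ARPA/AIS (function and variable names are illustrative, not from any real system):

```python
import math

def cpa_tcpa(rel_east_nm, rel_north_nm, rel_vel_east_kn, rel_vel_north_kn):
    """Closest point of approach (nm) and time to it (hours).

    Inputs are the contact's position relative to own ship (nm, east/north)
    and the relative velocity (contact velocity minus own velocity, knots).
    """
    rel_speed_sq = rel_vel_east_kn ** 2 + rel_vel_north_kn ** 2
    if rel_speed_sq == 0.0:
        # No relative motion: the range never changes.
        return math.hypot(rel_east_nm, rel_north_nm), float("inf")
    tcpa_h = -(rel_east_nm * rel_vel_east_kn +
               rel_north_nm * rel_vel_north_kn) / rel_speed_sq
    tcpa_h = max(tcpa_h, 0.0)  # CPA already passed -> current range is the minimum
    cpa_nm = math.hypot(rel_east_nm + rel_vel_east_kn * tcpa_h,
                        rel_north_nm + rel_vel_north_kn * tcpa_h)
    return cpa_nm, tcpa_h

# Contact 6 nm due north, closing at roughly 12 kn on a near-reciprocal course:
cpa, tcpa = cpa_tcpa(0.0, 6.0, 1.0, -12.0)
print(f"CPA {cpa:.1f} nm in {tcpa * 60:.0f} min")
```

In that example the contact would pass at about 0.5 nm in roughly 30 minutes, which is the kind of number the assisting watchstander would call out.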

See posts #3 and #7

Rolls Royce also proposed this four years ago. Their concept included audible feedback as well as augmented reality images.