New Result on Distracted Driving

It’s Your Eyes, Not Your Brain

From the abstract of the original paper:

Drivers rarely focus exclusively on driving, even with the best of intentions. They are distracted by passengers, navigation systems, smartphones, and driver assistance systems. Driving itself requires performing simultaneous tasks, including lane keeping, looking for signs, and avoiding pedestrians. The dangers of multitasking while driving, and efforts to combat it, often focus on the distraction itself, rather than on how a distracting task can change what the driver can perceive. Critically, some distracting tasks require the driver to look away from the road, which forces the driver to use peripheral vision to detect driving-relevant events. As a consequence, both looking away and being distracted may degrade driving performance. To assess the relative contributions of these factors, we conducted a laboratory experiment in which we separately varied cognitive load and point of gaze. Subjects performed a visual 0-back or 1-back task at one of four fixation locations superimposed on a real-world driving video, while simultaneously monitoring for brake lights in their lane of travel. Subjects were able to detect brake lights in all conditions, but once the eccentricity of the brake lights increased, they responded more slowly and missed more braking events. However, our cognitive load manipulation had minimal effects on detection performance, reaction times, or miss rates for brake lights. These results suggest that, for tasks that require the driver to look off-road, the decrements observed may be due to the need to use peripheral vision to monitor the road, rather than due to the distraction itself.

Interesting implications for bridge and cockpit designs. Maybe HUDs were a better idea than we ever knew.




Interesting data, but I suspect that boats & car “driving” may differ in some significant ways. In the car, our attention is almost always limited to visual range - and generally only the (unaided) visual range that reveals useful detail. Things like brake lights (and other warning lights), road detail and hazards like pedestrians/livestock/vehicles. On the boat, we have information coming to us from the instrumentation (plus magnified vision) that is at ranges that far exceed normal vehicular use. In my experience, this has led to some “uh-oh” moments when my focus wasn’t at the proper range - I was concentrating on a near hazard and a distant one appeared but didn’t get dealt with properly - or the reverse where I was “zoomed out” and missed something that only showed up on a “zoomed in” setting. These range errors seem to be very rare in vehicular situations - probably mostly because we don’t use instrumentation for hazard avoidance when driving our cars.


There is the close range, which can (mostly) only be covered visually, and the far range, which can mostly only be covered by instruments. Then there is the mid-range, which can be done either way, or better still by both.

In a high-workload situation the issue is the cognitive effort and time needed to reload short-term memory. So, for example, if the conning officer has to leave the window and use the radar to determine the exact CPA of a vessel, then effectiveness will be reduced.

On the other hand, using an extra officer and BRM, the conning officer can simply ask the officer manning the radar for the information required without losing the overall picture. If a HUD display had the CPA superimposed over the vessel in question, the additional officer would not be needed.
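The CPA the conning officer steps away to find is just a relative-motion calculation: the range is minimized at the time when the relative position and relative velocity are perpendicular. A minimal sketch (the target numbers at the bottom are hypothetical, purely for illustration):

```python
import math

def cpa_tcpa(rel_x, rel_y, rel_vx, rel_vy):
    """CPA and TCPA from a target's position and velocity relative
    to own ship. Positions in nautical miles, velocities in knots;
    TCPA comes out in hours."""
    v2 = rel_vx**2 + rel_vy**2
    if v2 == 0:  # no relative motion: range never changes
        return math.hypot(rel_x, rel_y), 0.0
    # time at which the relative range is minimized
    tcpa = -(rel_x * rel_vx + rel_y * rel_vy) / v2
    tcpa = max(tcpa, 0.0)  # if CPA is already past, report current range
    cpa = math.hypot(rel_x + rel_vx * tcpa, rel_y + rel_vy * tcpa)
    return cpa, tcpa

# Hypothetical target: 5 nm east, 1 nm north, closing due west at 10 kn
cpa, tcpa = cpa_tcpa(5.0, 1.0, -10.0, 0.0)  # CPA 1.0 nm in 0.5 h
```

This is the same arithmetic an ARPA does continuously for every tracked target, which is exactly why handing the question to the officer at the radar (or a HUD overlay) is cheaper than the conning officer rebuilding the picture himself.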

Sir, I think you’ve added another significant difference between ships and cars - a car/truck is pretty much always a single-operator vehicle, and additional “crew” generally add distraction, not help (don’t tell my wife I said that!). But a properly functioning bridge (or aircraft cockpit) can have multiple operators - the “properly functioning” part is a skill set that requires training and experience, but one I suspect most crews have available.
In my boat case, I’m often a single operator - and that’s where the trouble starts. I might be “zoomed in,” fixated on, say, maintaining my position along the edge of the channel, and miss a developing hazard a couple of miles ahead - which then startles me when it appears on the screen. The opposite case is well exemplified by the racing sailboat that hit the reef in the Indian Ocean a few years ago. Such situations seem to almost always occur at night or in limited visibility, when our unaided visual acuity is low.
In sailing and flying, we’re always trying to stay ahead of the boat/aircraft, and sometimes it’s difficult to shift ranges back and forth - a difficulty not shared by most vehicle operations. Having multiple sets of eyes (and brains) is a real benefit.


I agree here; I think the short-range/long-range focus is a key issue. But while adding more trained/qualified personnel is a solution deep-sea, it has to be used very judiciously, like any limited resource. Any aid that can reduce the requirement for double manning is going to help.

The Vestas grounding is a good example of over-reliance on an imperfect ECDIS, or maybe just an electronic chart display. “Zoomed out” = no islands on the chart; “zoomed in” = islands on the chart.
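The “zoomed out = no islands” behaviour isn’t a rendering glitch; electronic chart features carry a minimum-display-scale attribute (SCAMIN in S-57 chart data), and the display simply drops features once you zoom out past it. A rough sketch of that filter, assuming scales expressed by their denominators (the 1:90,000 islet below is a made-up example):

```python
def visible(feature_scamin: int, display_scale_denom: int) -> bool:
    """SCAMIN-style filter: a chart feature is drawn only while the
    display scale is at least as large (denominator as small) as the
    feature's minimum display scale. Zooming out past it hides it."""
    return display_scale_denom <= feature_scamin

# Hypothetical islet tagged with SCAMIN 1:90,000
islet_scamin = 90_000
visible(islet_scamin, 50_000)   # zoomed in  -> islet shown
visible(islet_scamin, 300_000)  # zoomed out -> islet silently hidden
```

The trap is that the filter is silent: nothing on a zoomed-out screen tells the navigator that hazards exist below the current display threshold.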

The loss of the Aegean is an even more striking example.


I don’t see how either of these accidents were the result of distracted driving as described above. It’s worth noting that they both happened at night.
In the case of the Vestas, the crew was on deck and actively sailing when they hit the reef. It was an error on the part of the navigator. In the case of the Aegean, it appears that the lone crewmember left on deck fell asleep.

It’s not, really; we took this off-topic a bit. The OP had to do with peripheral vision, which is an interesting topic. It was touched on a bit here: A Mongrel of a Map?

A major role of peripheral vision, by comparison, is to monitor a much wider area, looking for regions that appear interesting or informative, in order to plan eye movements.

Given that, it makes sense that something in your peripheral vision could “catch your eye,” as that’s its function. As was mentioned, driving is mostly a simple task. If brake lights catch your eye, it’s just a matter of looking up and taking the appropriate action, like slamming on the brakes. At sea it takes a while to regain the entire traffic picture when in heavy traffic.