Future of ships


You just can’t seem to help taking a jab at the U.S. and bringing your political POV into just about every post you make. Maybe you need to see someone about it!


More about the maritime fuel mix of the future in Splash 24/7 today:


The videos of the self-driving car fatality have been released. According to the reports, the “safety driver” was not looking at the road, the victim did not dart in front of the vehicle, and the system had two seconds to detect the unfortunate woman:


I think a team of young engineers is now undergoing a maturing experience, both technically and emotionally.

Given the haste with which these systems have been developed, I am not optimistic that a definitive cause and fix will be discovered.



You, Sir, should see someone about your sense of humor. (Or lack thereof.)


The penalty for jaywalking unlit at night is death, as you would expect.
6,000 pedestrians died in the USA in 2017 without the help of AVs.

I can think of lots of ship accidents that most likely wouldn’t have happened if you took control away from the humans.


Avoiding the pedestrian in this case should have been 100% within the advertised capabilities of these cars. Presumably if the detection system had a fault it would have given an error of some kind.

At this point this makes it much more likely that the claims made about these cars are mostly hype. That should be taken into account when evaluating these claims in general.

Video suggests huge problems with Uber’s driverless car program

Conventional car crashes killed 37,461 in the United States in 2016, which works out to 1.18 deaths per 100 million miles driven. Uber announced that it had driven 2 million miles by December 2017 and is probably up to around 3 million miles today. If you do the math, that means that Uber’s cars have killed people at roughly 25 times the rate of a typical human-driven car in the United States.
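The arithmetic behind that "roughly 25 times" can be reproduced in a few lines. Using the quote's own figures (1.18 deaths per 100 million miles for human drivers, one death in an estimated 3 million Uber miles), the multiple actually comes out closer to 28x; the article's "roughly 25" is the same ballpark:

```python
# Back-of-envelope fatality-rate comparison using the figures quoted above.
human_rate = 1.18 / 100e6      # deaths per mile, U.S. conventional cars, 2016
uber_deaths = 1                # the single fatality under discussion
uber_miles = 3e6               # the article's estimate of total Uber AV miles

uber_rate = uber_deaths / uber_miles
ratio = uber_rate / human_rate
print(f"Uber rate is roughly {ratio:.0f}x the human rate")
```

As the later comments point out, a rate built on a single event has enormous uncertainty; this only shows where the headline number comes from.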


You beat me to it – anyone interested in the direction technology is going these days should read that article. This quote says all that needs to be said about the startup/Silicon Valley attitude toward high-consequence engineering:

“We don’t need redundant brakes & steering or a fancy new car; we need better software,” wrote engineer Anthony Levandowski to Alphabet CEO Larry Page in January 2016. “To get to that better software faster we should deploy the first 1000 cars asap. I don’t understand why we are not doing that. Part of our team seems to be afraid to ship.” In another email, he wrote that “the team is not moving fast enough due to a combination of risk aversion and lack of urgency.”

So this clown’s idea of engineering discipline is to turn 1000 admittedly inadequate prototypes loose and see what happens. Agile software development meets kinetic energy; agile development loses.



You can’t do statistics that mean anything with one fatality. IMO that whole paragraph is junk.

OTOH, there’s something drastically wrong on the face of it if the Uber vehicle can run into a slow-moving pedestrian who had already crossed a lane and a half of unobstructed road. This wasn’t a matter of someone leaping from concealment into the vehicle’s path.


What this really shows is that if you give humans too many toys, the humans fall asleep at the switch.


This is Uber’s fault. It never would have happened if they’d hired Norwegian software engineers.


Sounds like a car has never been released on the road with a fault that killed people and needed a recall.
We must be in a whole new realm now that that will start happening.


Bit like the increasing number of ECDIS-assisted crashes and groundings; time to get the electronics off the ships, or the operators…

You notice that most plane crashes these days are pilots crashing perfectly good planes; time they got them out of the cockpit, as they just no longer have the skills required.
(The grey-haired ones still do.)


Human Driver Could Have Avoided Fatal Uber Crash, Experts Say

Zachary Moore, a senior forensic engineer at Wexco International Corp. who has reconstructed vehicle accidents and other incidents for more than a decade, analyzed the video footage and concluded that a typical driver on a dry asphalt road would have perceived, reacted, and activated their brakes in time to stop about eight feet short of Herzberg.


Doesn’t matter, it’s just collateral damage. There’s an urgent need to get rid of humans in order to increase profits.


Of course there have been cars shipped with fatal flaws, but it has been years since people have put cars on the road with fundamental aspects as severely under-engineered as these.

It’s not software that’s the problem, it’s software that’s produced by the Silicon Valley “crap in a hurry*” business model that’s the problem. I was on the verification team for the Saab JA-37B autopilot, the first full-authority digital fly-by-wire system to go operational. I was told by a Saab engineer that they flew for eleven years without an inflight emergency. I can give you any number of other examples.

This woman was not killed by software, she was killed by a corporation called Uber. Google on them. They are a crap company led (up to recently) by a crap CEO and populated by crap engineering management, as the quote I cited shows. Could others do better? Maybe, but definitely not anywhere near in the timeframes the hype artists are promising, and definitely not without massive upgrades to infrastructure to overcome the inherent limitations of sensors and sensor fusion.


  • The “crap in a hurry” remark comes from the comedian Rick Hall. This will only make sense to people who were in America in the 80’s, but here goes:

“I used to wonder what Americans wanted. Then I noticed that the three highest-gaining stocks this year were Home Shopping Network, Federal Express, and Domino’s Pizza. Now I know what Americans want. They want crap in a hurry.”

Home Shopping Network was a cable network where you could phone in and buy shoddy merchandise for delivery, Federal Express is now FedEx, and Domino’s Pizza was a chain that promised a delivered pizza in 30 minutes or it was free, a promise dropped after a string of accidents.

In the Silicon Valley business model, being first to market and establishing market domination is fundamental. Quality comes second. Hence, “crap in a hurry.” This is what made Bill Gates rich. All very good, until you add kinetic energy to the mix.


And in any case it should have been a slam dunk for a lidar-equipped vehicle, which sees as well in the dark as in daylight.


I agree that it would be a mistake to place much confidence in the 25x rate they have calculated.

Beyond that, I don’t understand statistics very well, but if you flip it (U(ber) driver = 3 million miles, H(uman) driver = 100 million miles), doesn’t that reduce the probability that the U driver will have a lower rate than the H driver?

For example if I build some car driving software and it kills someone in the first 10 feet do I have to go another 100 million miles before I can calculate some probability of it being safer or not?


Yes, you do. Or rather, you have to wait until more people are killed, however long that takes. How many more, you’d have to ask someone who understands statistics better. But by its nature it’s about large numbers.

Of course if you can discover a failure – and this appears to be a gross failure – then you don’t have to wait around for statistics to tell you there’s a problem.


That’s what I’m not sure about. I understand a large sample size is needed to get an accurate rate. But to answer the question of how safe Uber is compared to human drivers: is it 100 times safer? 10x safer? 5x? Twice as safe, or about the same? It seems that, given this one accident, the probability of it being 100 times safer is less than before.

Seems we should reduce our Bayesian priors given this new information.
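That intuition can be made concrete with a toy Gamma-Poisson update. The prior below is invented purely for illustration (a vague prior whose mean equals the human rate of 1.18 deaths per 100 million miles); the observation is the one fatality in an estimated 3 million miles:

```python
# Toy Bayesian update on the AV fatality rate (deaths per mile).
# Gamma prior, Poisson likelihood; conjugacy gives the posterior in
# closed form. Prior parameters are invented for illustration only.
human_rate = 1.18 / 100e6              # deaths per mile, human drivers

alpha0, beta0 = 1.0, 1 / human_rate    # prior mean alpha0/beta0 = human_rate
deaths, miles = 1, 3e6                 # the observation under discussion

alpha, beta = alpha0 + deaths, beta0 + miles
post_mean = alpha / beta
print(f"posterior mean: {post_mean * 100e6:.2f} deaths per 100M miles")
```

Even this single observation roughly doubles the posterior mean rate, which is the commenter’s point: one fatality this early shifts belief away from "much safer than humans." At these mileages, though, the answer is still dominated by the choice of prior.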


Open the box and watch the bats fly out: