On 7/9/21 11:55 PM, Peri Hartman via EV wrote:

> But, honestly, if the driver is paying attention and autopilot shuts down, what's to lose ? Up to that point, either autopilot was driving the car correctly, in which case the driver should be able to continue just fine without it. Or, autopilot was not driving correctly in which case the driver should have already taken over. I am having a hard time to imagine a situation where Tesla or autopilot can be blamed for a crash unless it actually prevented the driver from taking over, and that has never happened as far as I know.

Yes, the driver is still responsible, but the big issue is that Tesla advertises it as "Autopilot" or "Full Self-Driving (Beta)" - something designed to (eventually) drive your car with no required supervision from the driver.

Every other car manufacturer calls it "lane assist" or "intelligent cruise control" or "freeway driving assistant", which more accurately gives the user the impression that it's a fallible assistant that needs to be directly supervised - not something you would ever consider defeating by tying a weight to the steering wheel so you can watch a movie.

This leads to people using it (and trusting it?) in dangerous situations where, if it cuts out, things go wrong very quickly.

See for example:
https://www.youtube.com/watch?v=8ATJaVTpviQ



There is also the issue of how much time a driver needs to recognize that the autopilot has failed and take control. In the video above, the warning chime starts less than a second before the car crashes into the trees.

Jay
_______________________________________________
Address messages to ev@lists.evdl.org
No other addresses in TO and CC fields
UNSUBSCRIBE: http://www.evdl.org/help/index.html#usub
ARCHIVE: http://www.evdl.org/archive/
LIST INFO: http://lists.evdl.org/listinfo.cgi/ev-evdl.org
