First, there have been very few deaths involving Teslas. So, most people *do* understand and take seriously that Autopilot is not self-driving.

But, realistically, you shouldn't need any warning to take over from autopilot should it cut out or fail.

The one exception I can think of, and I don't know if this has occurred, is if you are driving down a safe, straight stretch of road and briefly look at the scenery or a passenger. Or similarly, looking for an address while driving in a city (you have to take your eyes off the road for a moment). You would expect the car to continue going straight. In that case, if autopilot caused the car to veer, I would say autopilot is at fault.

But while going around a curve, handling a construction deviation, changing lanes, etc., I think you really should be paying 100% attention.

Peri

<< Annoyed by leaf blowers ? https://quietcleanseattle.org/ >>

------ Original Message ------
From: "Jay Summet via EV" <ev@lists.evdl.org>
To: ev@lists.evdl.org
Cc: "Jay Summet" <j...@summet.com>
Sent: 10-Jul-21 5:02:01 AM
Subject: Re: [EVDL] AutoPilot drops out on tight road turns.



On 7/9/21 11:55 PM, Peri Hartman via EV wrote:

But, honestly, if the driver is paying attention and autopilot shuts down, 
what's to lose? Up to that point, either autopilot was driving the car 
correctly, in which case the driver should be able to continue just fine 
without it. Or, autopilot was not driving correctly, in which case the driver 
should have already taken over. I am having a hard time imagining a situation 
where Tesla or autopilot can be blamed for a crash unless it actually prevented 
the driver from taking over, and that has never happened as far as I know.

Yes, the driver is still responsible, but the big issue is that Tesla advertises it as an 
"auto-pilot" or "full self driving (beta)" that is designed to (eventually) 
drive your car with no required supervision from the driver.

Every other car manufacturer calls it "lane assist" or "intelligent cruise control" or 
"freeway driving assistant", which more accurately gives the user the impression that it's a 
fallible assistant that needs to be directly supervised, and not something that would tempt you to tie 
a weight to your steering wheel and watch a movie.

This leads to people using it (and trusting it?) in dangerous situations where, 
if it cuts out, things go wrong very quickly.

See for example:
https://www.youtube.com/watch?v=8ATJaVTpviQ



There is also the issue of how much time a driver needs to recognize that the auto-pilot 
has failed and take control. In the video above, it starts to "bing" less than 
a second before they crash into the trees.

Jay
_______________________________________________
Address messages to ev@lists.evdl.org
No other addresses in TO and CC fields
UNSUBSCRIBE: http://www.evdl.org/help/index.html#usub
ARCHIVE: http://www.evdl.org/archive/
LIST INFO: http://lists.evdl.org/listinfo.cgi/ev-evdl.org
