On Wednesday, 31 May 2017 at 13:04:52 UTC, Steven Schveighoffer
wrote:
> This is like the equivalent of having a guard rail on a road
> not only stop you from going off the cliff but proactively
> disable your car afterwards to prevent you from more harm.
Sorry for the double post, but after thinking more about this, I
do not agree that this fits. I think a better analogy would be
this:
Your car has an autonomous driving system and an anti-collision
system. The anti-collision system detects that you are about to
hit an obstacle (let us say another car); as a result, it engages
the brakes and shuts off the autonomous driving system.
It might be that the autonomous driving system was in the right
and the near collision was caused by another human driving
illegally, but it might also be that there is a bug in the
autonomous driving system. If the latter is the case, then this
one time the anti-collision device caught the result of the bug,
but the next time the autonomous driving system might drive you
off a cliff, where the anti-collision system would not help at
all.
So the only sane thing to do is shut the autonomous driving
system off, requiring human intervention to decide which of the
two was the case (and, if it was the former, turn it back on).
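In software terms, this is the fail-fast principle: once a safety check catches a state that should have been impossible, the component that produced it can no longer be trusted, so it is disabled until a human investigates. A minimal sketch of that pattern (all names and numbers here are hypothetical, purely for illustration):

```python
class InvariantViolation(Exception):
    """Raised when a safety check detects a state that should be impossible."""

class Autopilot:
    """Hypothetical autonomous driving system."""
    def __init__(self):
        self.enabled = True

    def plan_speed(self, distance_to_obstacle):
        # Toy planner: slow down as the obstacle gets closer.
        return max(0.0, distance_to_obstacle - 5.0)

def drive_step(autopilot, distance_to_obstacle):
    """One control cycle, guarded by an anti-collision check."""
    speed = autopilot.plan_speed(distance_to_obstacle)
    # Anti-collision check: the planned speed must leave room to stop.
    if speed > distance_to_obstacle:
        # The check fired. Brake AND disable the autopilot entirely:
        # maybe the environment was at fault, maybe the planner has a
        # bug, but until a human decides which, none of the planner's
        # future decisions can be trusted.
        autopilot.enabled = False
        raise InvariantViolation("planned speed exceeds stopping distance")
    return speed
```

After `InvariantViolation` fires, `enabled` stays `False`; re-enabling is a deliberate human decision, not something the system does on its own, which mirrors the "requires human intervention" step above.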