One point I've never seen covered is liability in an at-fault accident, and this line at the end scares me a bit:
Quote:
|
or perhaps make no choice at all (just shutting down when encountered with a no win scenario to avoid additional liability)
|
If a self-driving car simply shuts off the automation when it gets confused, the "driver" (operator?) is suddenly in control with zero situational awareness, and pretty much guaranteed to crash. Does the liability for that crash go to the driver, who wasn't really controlling the thing, or the manufacturer, who is likely to be a bag of dicks to deal with?
And of those 1 million miles, what kind of driving was the car actually doing? And how well does any of this work when bugs, snow, or ice are covering the sensors? As much as I'm a fan of shitty drivers not driving, letting them drive less is just going to make them even shittier during the portion of the time they do have to drive.