How many people die because of a manual override that does worse than the computer? How many people die because the computer itself made a mistake that a human could reasonably have overridden safely?
I fully expect the second number to be much lower than the first, eventually. And when that happens, I'll be glad to see manual driving treated as a lesser crime. Because make no mistake: risking your own life is your choice. Drive on tracks, smoke, drink, have unprotected sex (with an informed and willing partner), whatever. Driving on public roads, on the other hand, puts everyone else at risk. A small risk, but still. Having that kind of control over other people's lives is, I'd say, simply unethical.
> Yet they seem to implicitly trust the result of those same humans building the network and infrastructure to do the driving for them.
Of course, never trust them farther than you can throw them. But you can throw them pretty far: just sue their ass off whenever they're responsible for an otherwise avoidable accident (say, a software bug caused by sloppy engineering practices). Also, manufacturers can test the cars systematically, and you can expect more consistency from them than from human drivers.
Those are good arguments for the automation existing in the first place, but they don't address why manual controls shouldn't be available.
Which to me is the crux of the issue: in a scenario without an available override, when something goes wrong there is nothing you can do except hope it doesn't kill you, because at that point it's no longer about the AI, it's all about collision physics.