The average driver breaks multiple laws on every trip. Most of the time no one gets hurt. But calibrating performance against people who routinely violate traffic and criminal laws sets the bar too low for an automated system. We should be aiming for standards that at least match European road-safety levels, or ideally approach the safety of air or rail travel.
Except that doesn't work if you're trying to produce a safe product. Investigations into airline crashes have shown that removing pilots from active participation in the control loop leads to distraction and slower response times when an abnormal situation occurs. Learning to cope with this is part of pilots' training, and they also have a co-pilot to keep an eye on things and back them up.
An imperfect self-driving vehicle is the worst of all worlds: it lulls the driver into perceiving the vehicle as safe while being unable to handle abnormal situations. The multiple crashes on record in which Teslas have driven into stationary trucks and other obstacles are pretty damning proof that drivers can't always react in the time required when an imperfect self-driving system is in use. These systems are not intrinsically safe.
At the very least, drivers should be required to undergo additional training before operating these systems. Like pilots, drivers need to be taught how to recognize when things go awry and how to react to possible failures. Anything less is not rooted in safety culture, and it's good to see at least a few people starting to shine a light on how these systems are being implemented from a safety perspective.
> Perfect is the enemy of good, and rejecting a better system because it isn't perfect seems like an absurd choice.
Nothing absurd about thinking a system with mere parity with the average human driver is too risky to buy, unless you consider yourself below average at driving. (As it is, most people consider themselves better-than-average drivers, and some of them are even right!) The accidents that make up the "average human accident rate" are also disproportionately caused by drivers you'd try to discourage from driving in those circumstances anyway, so the average overstates the risk faced by a careful, sober driver.
Another very obvious problem is that an automated system which kills at the same rate per mile as an average human driver will tend to be driven a lot more, because driving it takes no effort (and it will probably replace better-than-average commercial drivers long before teenagers and occasional-but-disproportionately-deadly drivers can afford it).
Yes, I agree: we should hold automated systems to a higher standard. Unless, that is, you're proposing we ban automated systems until they're effectively perfect, because that would perversely produce a worse outcome: being stuck with unassisted driving forever.