Hacker News

It's a marketing parlor trick. They take legal responsibility for something that cannot happen and immediately drop to human control when things turn south, like all so-called "Level 3" systems.

I'm also sure they will not take any responsibility if someone rear-ends you when the car stops in confusion on the highway.



Did you read the article? They guarantee a 10-second manual takeover time and remain responsible during those 10 seconds. That's not the same as the instant dropping of the ball that Tesla, Volvo, GM etc. all do.


Their system (conforming to German law, and as mentioned in the article) gives you a 10-second takeover window within which they are still liable.

Yes they do. That's the point of their announcement, and the reason why they only allow it under very specific and favourable conditions.


This is remarkable, twice over.

First, a software vendor accepting responsibility for the software's actions? Wow.

Second, they're confident of being able to predict accidents ten seconds in advance? That's up to 160 m away, and I think that's great, even if they limit the circumstances sharply and allow many false positives.
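As a quick sanity check on that figure, here is a minimal sketch (the ~58 km/h input is an assumption about the slow, motorway-traffic-jam speeds such systems operate at; it is not stated in this thread):

```python
def takeover_distance(speed_kmh: float, window_s: float = 10.0) -> float:
    """Distance in metres covered during the takeover window at a given speed."""
    return speed_kmh / 3.6 * window_s  # km/h -> m/s, then multiply by seconds

# At roughly 58 km/h, a 10-second warning corresponds to about 160 m of road:
print(round(takeover_distance(57.6)))  # → 160
```

So "160 m" is simply the 10-second window expressed as distance at the kind of speeds where the system is allowed to engage.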


I don't think they're predicting exactly; I think they've decided that ten seconds is a reasonable thing to ask of a consumer, and they're building their Ts & Cs to fit. They'll then make the technology fit as best it can, and if it can't, it's their fault.


This is not "a software vendor accepting responsibility for the software's actions"; this is a car, placed among unpredictable human drivers operating two-tonne machines. Far from Photoshop running on your PC or an embedded system in your fridge.


So it should be much easier to get a warrant of fitness for your fridge, if not the photo-editor, right?


No, they are confident that they can detect, at least 10 seconds in advance, whether one of the many conditions required to operate the autopilot will be violated. Upcoming tunnel, construction, etc.


Right. And any accident falls into at least one of these three classes: something they won't need to pay for (even via insurance premiums), something the software can avoid, and lastly, most significantly, something for which the software can provide ten seconds' warning.

I don't find it remarkable that the software has many reasons to disengage. I find it remarkable that potential accidents >10s into the future are on the list of reasons, even in limited circumstances.

The first software I bought came with a warranty that covered nothing: It explicitly said that the software wasn't guaranteed to perform "any particular function". As I read that text, that vendor had the right to sell me three empty floppy disks. You've seen similar texts, right?

And here we have Mercedes guaranteeing considerable foresight in limited circumstances. No matter how limited the circumstances are, that's a giant leap.


I'd expand that to four classes of accidents. Fourth: something they /will/ need to pay for (both in money, as they're committing to, and in PR). Inevitably some accidents won't fall into your three "preferred" categories -- making a system like this successful is about managing the size of bucket four, not eliminating it entirely.



