
As a programmer, the thought terrifies me.


As someone who tries to do risk analysis, the prospect of sticking with human drivers out of fear of software bugs (which will inevitably kill, just in much smaller numbers) terrifies me.


The fear is not of bugs killing people. The fear is of bugs allowing people to kill people.

If ISIS were able to hack a major fleet through one such bug, do you think for a single moment they wouldn't use it to kill many people?


Which is a legitimate fear, and substantial effort should go into preventing such bugs, but a sufficiently determined person doesn't need to exploit a software bug to kill others. ISIS appears to be quite effective at simply convincing its members to carry out such acts directly and voluntarily.


Yeah, but we're dealing with an order-of-magnitude difference if we give them control over >1 million mobile bombs (i.e., one large self-driving car network).


Unless it were somehow possible to remotely trigger a battery explosion, no, we would not be giving anyone control over “mobile bombs”.


Could a programmer be held liable for any bad outcomes?



