
Sure - this isn’t to imply that can’t happen, or even that it isn’t the more common case.

It’s only to say that just because something may seem far away, it may not be - even if you’re the person who will invent it only two years later (and therefore are probably in the best position to know).

Given the high stakes of unsafe AGI and this uncertainty, it’s probably worth having some people work on goal alignment.

This is somewhat unrelated to the recent release though.
