Yes! I have a feeling that the more clearly you define the goal of "something that has human-like intelligence but isn't human," the less sense it will make, because it will become clear that our idea of intelligence depends heavily on the specific ways that humans relate to the world. I wrote a blog post on this - https://gushogg-blake.com/posts/agi