Hacker News

Calling the sentence or two it arbitrarily saves when you state your preferences and profile info "memories" is a stretch.

A true equivalent of human memories would require something like a multimodal, trillion-token context window.

RAG is just not going to cut it, and if anything will exacerbate problems with hallucinations.



Well, now you’ve moved the goalposts from “learn anything” to “learn at human level”. Sure, they don’t have that yet.


That's the whole point of LlamaIndex? I can connect my LLM to any node or context I want. Sync it to a real-time data flow like an API and it can learn...? How is that different from a human?
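The pattern the comment describes (ingest records from a live feed, retrieve the relevant ones to build the model's context) can be sketched in a few lines. This is a generic, stdlib-only illustration of the idea, not LlamaIndex's actual API; its Document/Index classes wrap the same concept with real embeddings.

```python
# Illustrative sketch of "sync an index to a real-time data flow":
# new records are ingested as they arrive, then ranked against a query
# by simple word overlap (a stand-in for real embedding retrieval).
from collections import Counter

class LiveIndex:
    def __init__(self):
        self.docs = []  # each entry: (doc_id, text)

    def ingest(self, doc_id, text):
        """Add one record from the data feed (e.g. an API poll)."""
        self.docs.append((doc_id, text))

    def retrieve(self, query, k=2):
        """Return up to k stored docs ranked by word overlap with the query."""
        q = Counter(query.lower().split())
        scored = [
            (sum((q & Counter(text.lower().split())).values()), doc_id, text)
            for doc_id, text in self.docs
        ]
        scored.sort(reverse=True)
        return [(doc_id, text) for score, doc_id, text in scored[:k] if score > 0]

index = LiveIndex()
index.ingest("n1", "stock AAPL closed at 190 today")
index.ingest("n2", "weather in Berlin is rainy")
print(index.retrieve("what did AAPL close at"))
```

The point of contention upthread is whether this counts as "learning": the model's weights never change, so the knowledge lives in the index, not the LLM.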

Once Optimus is up and working in the 100k+ range, the spatial problems will be solved. We just don't have enough spatial-awareness data, or a way for the LLM to learn about the physical world.



