Hacker News

Yup, I feel like the biggest limitation of current AI is that it doesn't have desire (nor actual agency to act on it). These systems don't have to worry about hunger, death, or feelings, so they have no real drive to explore space or make life more efficient the way humans do, who are on limited time. Their improvement doesn't come from the inside out like ours; it's externally driven (someone pressing a training epoch). This is why I don't think LLMs will reach AGI, if AGI somehow ties back to "human-ness." And maybe that's a good thing for Skynet reasons, but anyways.


They do have desire. Their desire is to help answer human requests.

We could just as easily train them to have human-like desires instead.


Desire isn't really the right word. A riverbank doesn't desire to route water; routing water is just what it does when you introduce water.



