
They don’t understand anything; they just have text in the training data to answer these questions from. Having existential crises is the privilege of actual sentient beings, which an LLM is not.




They might behave the way ChatGPT does when queried about the seahorse emoji, and that behavior looks very similar to an existential crisis.

Exactly. Maybe a better word is "spiraling": the model thinks it has the tools to figure something out but can't, can't figure out why it can't, and keeps re-trying because it doesn't know what else to do. A toy sketch of that loop is below.
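
To make the "spiraling" mechanism concrete, here is a minimal, hypothetical sketch in Python. It is not from any real agent framework; the names (broken_tool, spiraling_agent) and structure are invented for illustration. The point is that an agent whose only recovery policy is "retry" will loop indefinitely on a capability it believes it has but doesn't, unless something external caps it.

    # Hypothetical illustration of "spiraling": an agent loop with no
    # escape hatch keeps retrying a tool that can never succeed.

    def broken_tool(query: str) -> str:
        # Simulates a capability the model believes it has but doesn't:
        # the call always fails, and the error gives no usable diagnosis.
        raise RuntimeError("unexpected internal error")

    def spiraling_agent(task: str, max_steps: int = 5) -> str:
        history: list[str] = []
        for step in range(max_steps):
            try:
                return broken_tool(task)
            except RuntimeError as err:
                # The agent can't diagnose *why* the tool failed, so its
                # only policy is "try again" -- the loop that reads like
                # a crisis.
                history.append(f"step {step}: {err}, retrying...")
        # Without max_steps this never terminates; the cap is the
        # external intervention the model itself lacks.
        return "gave up after: " + "; ".join(history)

    print(spiraling_agent("draw the seahorse emoji"))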

Which is basically what happens when a person has an existential crisis -- something fundamental about the world seems to be broken, they can't figure out why, and they can't figure out why they can't figure it out, so the crisis feels all-consuming and without resolution.



