I think by

> LLMs are unable to reason about the underlying reality

OP means that LLMs hallucinate 100% of the time, just with different levels of confidence, and have no concept of reality or ground truth.
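To make the "different levels of confidence" point concrete: a minimal toy sketch of how next-token generation works, assuming a made-up three-word vocabulary and invented logits. The model samples from a softmax over logits; the mechanism is identical whether the sampled token happens to be true or false, and no step consults a ground truth.

```python
import math
import random

def softmax(logits):
    # Convert raw logits into a probability distribution (numerically stable).
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical toy vocabulary and invented logits for the token after
# "The capital of France is". These numbers are illustrative, not from a real model.
vocab = ["Paris", "Lyon", "banana"]
logits = [5.0, 1.0, -2.0]

probs = softmax(logits)
for token, p in zip(vocab, probs):
    print(f"{token}: {p:.3f}")

# Even the low-probability "banana" can be emitted; generation is always a
# draw from a learned distribution, never a check against reality.
random.seed(0)
sample = random.choices(vocab, weights=probs, k=1)[0]
print("sampled:", sample)
```

In this view, a "correct" answer and a "hallucination" differ only in the probability mass the model assigns them, which is the confidence the comment refers to.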



Confidence? I think the word you’re looking for is ‘nonsense’



