
> LLMs are, at their core, search tools.

Fundamentally, no, they're not. That's why you get cases like the Air Canada chatbot that told a customer about a refund policy that didn't exist, or the lawyer in Mata v. Avianca who cited cases that didn't exist. If you ask an LLM to search for something that doesn't exist, there's a decent chance it will hallucinate it into existence for you.

What LLMs are good at is effectively turning fuzzy search terms into non-fuzzy terms; they're also pretty good at taking some text and recasting it into an extremely formulaic paradigm. In other words, turning unstructured text into something structured. The problem is that they don't have enough understanding of the world to do anything useful with a structured representation that needs to be accurate.
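To make that concrete, here's a minimal sketch of the recasting step using the OpenAI Python client (the model name, prompt, and schema are just assumptions, not anyone's canonical pipeline). The structuring works; the verification step at the end is the part the model can't do for you:

    import json
    from openai import OpenAI  # pip install openai

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    def extract_citation(text: str) -> dict | None:
        """Recast free-form legal prose into a fixed schema.

        The model handles the fuzzy-to-structured step well. What it
        does NOT do is guarantee the case is real, so the result must
        still be checked against an authoritative source.
        """
        resp = client.chat.completions.create(
            model="gpt-4o-mini",  # assumption: any JSON-mode model works
            response_format={"type": "json_object"},
            messages=[{
                "role": "user",
                "content": "Extract the case name, reporter citation, and "
                           "year from the following text as JSON with keys "
                           "name, citation, year. Text: " + text,
            }],
        )
        data = json.loads(resp.choices[0].message.content)
        if not all(k in data for k in ("name", "citation", "year")):
            return None
        # Structuring != truth: look up data["citation"] in a real case
        # database before relying on it (cf. Mata v. Avianca).
        return data

The last comment is the whole point: the recasting is reliable, but the facts inside it are only as good as the input, so the structured output still has to be verified externally.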


