sigmoid10 on June 5, 2024 | on: Legal models hallucinate in 1 out of 6 (or more) b...
What you're talking about are, by definition, no longer facts but opinions. Even AGI won't be able to turn opinions into facts. But LLMs are already very good at giving opinions rather than facts, thanks to alignment training.