What you're talking about is, by definition, no longer fact but opinion. Even AGI won't be able to turn opinions into facts. But LLMs are already very good at offering opinions rather than facts, thanks to alignment training.
