Hacker News

Those are based on similar online conversations. If lots of people insist they are right in conversations ChatGPT considers similar, then it will insist it is right; if many give up, then it will give up.

ChatGPT has no other way to gauge "confidence" in the text it outputs. The computed confidence has nothing to do with how truthful a statement is, only with how well the text fits the examples ChatGPT has seen. A person insisting that a wrong statement is right could fit better, and ChatGPT would then assign that statement high confidence. Even so, the number computed is entirely unrelated to the tone it responds in.
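That "confidence" is just the softmax probability the model assigns to each next token given the preceding text. A minimal sketch, using invented logits rather than a real model, shows how the same claim can score high or low purely depending on context:

```python
import math

def softmax(logits):
    # Convert raw per-token scores into a probability distribution.
    m = max(logits.values())
    exps = {tok: math.exp(v - m) for tok, v in logits.items()}
    total = sum(exps.values())
    return {tok: e / total for tok, e in exps.items()}

# Hypothetical next-token logits after two different contexts.
# These numbers are made up for illustration; a real model emits
# one logit per vocabulary entry.
after_insistent_context = {"right": 4.0, "wrong": 1.0, "maybe": 0.5}
after_conceding_context = {"right": 0.5, "wrong": 3.5, "maybe": 1.0}

p_insist = softmax(after_insistent_context)["right"]
p_concede = softmax(after_conceding_context)["right"]

# The same token gets high "confidence" in one context and low in the
# other -- the probability tracks textual fit, not truth.
print(round(p_insist, 2), round(p_concede, 2))
```

Nothing in this computation consults facts about the world; it only measures how well a continuation fits the surrounding conversation.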


