
It happily made up citations for me. In a follow-up, I asked it not to, and to please use only real papers. It apologized, said it would not do it again, then in the same reply made up another non-existent but plausible citation.

Checking the links is a good practice.
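
A minimal sketch of what that check might look like, assuming the reply's citations come with URLs you can extract: hit each link and see whether it resolves at all. The URLs below are placeholders, not citations from this thread, and a live link still doesn't prove the paper says what the model claims.

  import urllib.request
  import urllib.error

  def link_resolves(url: str, timeout: float = 10.0) -> bool:
      """Return True if the URL answers with a non-error HTTP status."""
      req = urllib.request.Request(
          url, method="HEAD",
          headers={"User-Agent": "citation-check/0.1"},  # some hosts reject blank agents
      )
      try:
          with urllib.request.urlopen(req, timeout=timeout) as resp:
              return resp.status < 400
      except (urllib.error.URLError, TimeoutError):
          # Covers DNS failures, HTTP errors (HTTPError subclasses URLError), timeouts
          return False

  if __name__ == "__main__":
      # Placeholder links; swap in whatever the model actually cited
      cited = [
          "https://arxiv.org/abs/1706.03762",
          "https://example.org/this-paper-does-not-exist",
      ]
      for url in cited:
          print(("OK   " if link_resolves(url) else "DEAD ") + url)

Even with that, you still have to open the ones that resolve and confirm the title and authors match what was claimed.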

I feel like we just created an interesting novel problem in the world. Looking forward to seeing how this plays out.



Are you talking about Bing Chat, which cites actual web pages it used to make the summary, or ChatGPT, which is a very different beast and relies on built-in knowledge rather than searches?


Good call. I was using ChatGPT.


Sounds like you should be doing the research yourself but are relying on an untrusted source to feed you answers? I don't think we're there yet...


On the contrary, I was doing a calibration: asking about something I know the answers to very well, to see if it was trustworthy.



