
It's so strange when one of these models obviously hits a preprogrammed non-answer; how can one ever trust them when there's a babysitter that interferes with the actual answer? I suppose asking what version it is isn't a valid question in its training data, so it's programmed to say "check the documentation" — but it's still definitely suspicious when it gives a non-answer.

