Not even that. It's a make-stuff-up engine. Search engines are at least able to return "no results found".
The current incarnation of ChatGPT can tell you it doesn't know something, like Elon Musk's current net worth, but that's clearly a canned answer that crudely patches over the underlying model's response, which is never empty.
Try asking it anything about real-life people. You'll often get a sentence or two of generated text, followed by a very similarly phrased excuse about why it can't know any better, as the "shepherd", however it is implemented, kicks in.