FWIW, I’d bet some coin that current ChatGPT would provide a genuine pseudo-random number on request. It can now recognise when answering a prompt requires a standard algorithm rather than ordinary sentence generation.
I found this out recently when I asked it to generate some anagrams for me. Then I asked how it did it.
In the context of gambling, random numbers or PRNGs can't have any unknown frequencies or tendencies. There can't be any doubt as to whether a number could have been distorted or hallucinated. A pseudo-random number that may or may not come from some algorithm picked by GPT is way worse than a Mersenne Twister, because it's open to distortion. Worse, there's no paper trail. MT is not the way to run a casino, or at least not sufficient on its own, but at least you know it's pseudorandom based on a seed. With GPT you cannot know that, which means it doesn't fit the definition of "random" in any way. And if you find yourself watching a player hit blackjack 10 times in a row at $2k per bet, you will ask yourself where those numbers came from.
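To be concrete about the paper-trail point, here's a minimal sketch in Python, whose random module is MT19937 under the hood; the seed value and draw range are just illustrative:

    import random

    # Seeded Mersenne Twister: anyone holding the logged seed can replay
    # the exact same stream later and audit every draw.
    seed = 20240615                     # hypothetical seed, logged somewhere durable
    rng = random.Random(seed)
    draws = [rng.randrange(1, 53) for _ in range(5)]   # e.g. five card indices

    # Replaying with the same seed reproduces the stream exactly.
    replay = random.Random(seed)
    assert [replay.randrange(1, 53) for _ in range(5)] == draws

With GPT picking numbers in-context there is no equivalent seed to log, so there's nothing to replay or audit.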
I think you're missing the point. Current incarnations of GPT can do tool calling; why shouldn't they be able to call a CSPRNG when they decide they need a genuinely random number?
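Rough sketch of the kind of tool I mean; the function name and schema are made up, but the point is that the number comes from the OS CSPRNG (Python's secrets module), not from the model's token sampling:

    import secrets

    # Hypothetical tool the model could call instead of "imagining" a number.
    def draw_random_int(low: int, high: int) -> int:
        """Return a uniform integer in [low, high] from the OS CSPRNG."""
        return low + secrets.randbelow(high - low + 1)

    # Tool schema in the usual JSON-function style (names are illustrative).
    RANDOM_TOOL = {
        "name": "draw_random_int",
        "description": "Draw a cryptographically secure random integer.",
        "parameters": {
            "type": "object",
            "properties": {
                "low": {"type": "integer"},
                "high": {"type": "integer"},
            },
            "required": ["low", "high"],
        },
    }

The model just decides when to call it and relays the result; the randomness itself never passes through the sampler.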
This test was conducted on Android with Firefox 128. Both ChatGPT sessions were not logged in, though the normal browsing history does include a few chatgpt.com visits.
Yeesh, that's bad. Nothing ever repeats and it looks like it makes sure to use every number in each sequence of 10 before resetting in the next section. Towards the end it starts grouping evens and odds together in big clumps as well. I wonder if it would become a repeating sequence if you carried it out far enough?
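If anyone wants to quantify that, a quick sanity check (assuming the output is just a flat string of digits) is to count how often each block of 10 is a perfect permutation of 0-9; a genuinely uniform source would only manage that about 10!/10^10 ≈ 0.036% of the time per block:

    from collections import Counter

    def permutation_block_rate(digits: str, block: int = 10) -> float:
        """Fraction of blocks containing each digit 0-9 exactly once."""
        blocks = [digits[i:i + block]
                  for i in range(0, len(digits) - block + 1, block)]
        hits = sum(1 for b in blocks if Counter(b) == Counter("0123456789"))
        return hits / len(blocks) if blocks else 0.0

    # A uniform source scores near 0.00036; the pattern described above
    # would score near 1.0.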