> And at the time, it was a complete waste of money. I don’t regret it at all.
I think, spending 50 bucks on bitcoin back then wouldn't have been a complete waste of money: in the worst case, you could have kept them as a souvenir from a passing fad to tell your grandchildren about. Like pet rocks.
However, I heard about bitcoin around the same time, and even read the whitepaper etc (I'm a mathematician and did some actual cryptography at the time), but still didn't buy any.
> This comes up now as “is vibecoding sane if LLMs are nondeterministic?” Again: do you want the CS answer, or the engineering answer?
Determinism would help you. With a bit of engineering, you could make LLMs deterministic: basically, fix the random seed for the PRNG and make sure none of the other sources of entropy mentioned earlier in the article contribute.
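The seed-fixing idea looks like this in miniature (a sketch, not any particular inference stack; `sample_tokens` and the toy distribution are made up for illustration):

```python
import random

def sample_tokens(probs, n, seed):
    """Sample n token indices from a fixed next-token distribution.

    With the PRNG seed fixed, and no other entropy sources leaking in,
    the whole sequence is reproducible -- the same principle applies
    to LLM sampling.
    """
    rng = random.Random(seed)  # private PRNG, isolated from global state
    tokens = list(range(len(probs)))
    return [rng.choices(tokens, weights=probs, k=1)[0] for _ in range(n)]

# hypothetical next-token distribution over a 4-token vocabulary
probs = [0.1, 0.5, 0.3, 0.1]
run_a = sample_tokens(probs, 10, seed=42)
run_b = sample_tokens(probs, 10, seed=42)
assert run_a == run_b  # identical seeds give identical "generations"
```

In practice the hard part isn't the seed, it's the "other sources of entropy" clause: batching, kernel selection, and hardware all have to be pinned too.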
But that barely impacts any of the issues people bring up with LLMs.
If you've got a fixed GPU that doesn't degrade at all during the process, I think? If you switch GPUs (even to another one of the same model) or run it long enough, the rounding differences feeding forward will produce different results, right?
The rounding itself is deterministic, but the floating-point operations leading to what gets rounded are not associative, and the scheduling of the warps/wavefronts isn't guaranteed.
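The non-associativity is easy to see even on a CPU; the summation order alone changes the result, which is why an unpinned warp schedule (effectively a different reduction order) can change bits:

```python
# Floating-point addition is not associative: reordering a sum
# changes which low-order bits get rounded away.
a, b, c = 1e16, -1e16, 1.0

left = (a + b) + c   # 0.0 + 1.0 -> 1.0
right = a + (b + c)  # the 1.0 is below the precision of 1e16 -> 0.0

print(left, right)   # 1.0 0.0
```

Each individual operation rounds deterministically; it's the grouping that isn't fixed.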
And determinism isn’t particularly helpful with compilers. We expect adherence to some sort of spec. A compiler that emits radically different code depending on how much whitespace you put between tokens could still be completely deterministic, but it’s not the kind of tool we want to be using.
Determinism is a red herring. What matters is how rigorous the relationship is between the input and the output. Compilers can be used in automated pipelines because that relationship is rigorous.
The problem you pointed out is real, but determinism in compilers is still useful!
Suppose you had one of those wildly unstable compilers: concretely, if you change the formatting slightly, you get a totally different binary. It still does the same thing as per the language spec, but it goes about it in a completely different way.
This weak determinism is still useful, because you can still get reproducible builds. E.g. volunteers can audit Debian binary packages by just re-running the compiler with the exact same input and checking that the output matches. So they can verify that no supply chain attack has fiddled with the binaries: at least the binaries belong to the sources they are claimed to.
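The audit reduces to comparing digests of an independent rebuild against the shipped artifact (a minimal sketch; `build` here is a hypothetical stand-in for a deterministic compiler, not any real toolchain):

```python
import hashlib

def digest(data: bytes) -> str:
    """Content hash of a build artifact."""
    return hashlib.sha256(data).hexdigest()

def build(source: str) -> bytes:
    """Stand-in for a deterministic compiler: any pure function
    of the source works for the sketch."""
    return source.encode().upper()  # hypothetical 'compilation'

source = "int main() { return 0; }"
published_binary = build(source)  # what the distro ships
rebuilt_binary = build(source)    # what an auditor rebuilds themselves

# The auditor never trusts the shipped binary directly -- only that
# its digest matches an independent rebuild from the claimed sources.
assert digest(published_binary) == digest(rebuilt_binary)
```

Real reproducible-builds work also has to pin the toolchain version, timestamps, and build paths, since any of those can leak into the output.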
The argument is that determinism in compilers isn't particularly important for building software because we did without it for a long time.
Your argument would be... that building software isn't particularly important for building software...?
The actual argument you'd be making would be something like, building software isn't particularly important for survival. Which is pretty obviously true, for the reason you state.
And the reason that relationship can be rigorous is because compilers by definition translate one formal language to another. You can't have a compiler that translates English to machine code in a rigorous, repeatable manner because English is ambiguous.