I remember the time when Python was the underdog and most AI/ML code was written in Matlab or Lua (Torch). People would roll their eyes when you told them you were doing deep learning in Python (Theano).
That's a very tenuous analogy. Microcontrollers are circuits that were designed. LLMs are circuits that were learned from vast amounts of data scraped from the internet and from pirated e-books[1][2][3].
Apple has an Apple Pay for Donations[1] program, which doesn't apply to rent-seeking entities like Patreon. I wonder whether Patreon's 10% fee is commensurate with the negligible value it provides.
GPT-5.2-Codex at xhigh with OpenAI Codex on the $20/month plan got to 1526 cycles with OP's prompt for me. Meanwhile, Claude Code with Opus 4.5 on the Team Premium plan ($150/month) gave up at 3433 cycles with a bunch of contrived excuses.
The frequencies they claim affect them are disputable, but the flicker in some cheap LED lights is real. Badly or cheaply designed electronics can flicker at frequencies as low as 50 Hz if they use only half-wave rectification instead of a full diode bridge (e.g. the Christmas lights that flickered in my peripheral vision when I was passing through Geneva airport).
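To illustrate why the rectifier topology matters: a minimal numpy sketch of my own (not from the thread; the sample rate and signal length are arbitrary demo values), comparing the ripple frequency of half-wave vs full-wave rectified 50 Hz mains.

    import numpy as np

    fs = 10_000                          # sample rate in Hz (arbitrary, demo only)
    t = np.arange(0, 1, 1 / fs)          # one second of signal
    mains = np.sin(2 * np.pi * 50 * t)   # 50 Hz mains voltage

    half_wave = np.clip(mains, 0, None)  # single diode: passes positive half-cycles only
    full_wave = np.abs(mains)            # full bridge: folds negative half-cycles up

    def dominant_hz(signal):
        # frequency of the strongest non-DC spectral component
        spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
        return np.fft.rfftfreq(len(signal), 1 / fs)[spectrum.argmax()]

    print(dominant_hz(half_wave))  # 50.0  -> light pulses at mains frequency, visible flicker
    print(dominant_hz(full_wave))  # 100.0 -> doubled ripple frequency, far less noticeable

The full bridge doubles the ripple from 50 Hz to 100 Hz, which is exactly the difference between flicker you can catch in your peripheral vision and flicker most people never notice.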
Indeed, it seems to be an Inflection AI[1]-style acquihire of senior leadership rather than an acquisition. Microsoft also entered into a non-exclusive licensing agreement with what was left of Inflection AI after poaching its founders.