Not if you are selling hardware. If I were Apple, Dell, or Lenovo, I would be pushing for locally run models with Hugging Face support while developing, at full speed, systems that can do inference locally.
Local models do make a lot of sense (especially for Apple), but it's tough to figure out a business model that would cause a company like OpenAI to distribute weights they worked so hard to train.
Getting customers to pay for the weights would be entirely dependent on copyright law, which OpenAI already has a complicated relationship with. Quite the needle to thread: it's okay for us to ingest and regurgitate data with total disregard for how it's licensed, but under no circumstances can anyone share these weights.
> Getting customers to pay for the weights would be entirely dependent on copyright law
That's assuming weights are even covered by copyright law, and I have a feeling they are not in the US, since they aren't really a "work of authorship".
> it's tough to figure out a business model that would cause a company like OpenAI to distribute weights they worked so hard to train.
It sounds a lot like the browser wars, where the winning strategy was to aggressively push one's platform for free (which was rather uncommon then), with the aim of market dominance and later benefits.
Provide the weights as an add-on for customers who pay for hardware to run them; the customers would be paying for weights + hardware. I think it is the same model as buying the hardware and getting macOS for free. Apple spends ~$35B a year on R&D, and training GPT-5 cost ~$500M. It would be a nothing burger for Apple to create a model that runs locally on their hardware.
That is functionally much harder to pull off than it is for software, because model weights are essentially more like raw media files than code, and media is much easier to convert to another runtime.
You can still extract the model weights from an on-prem machine. It has all the same problems as media DRM, and large enterprises do not accept unknown recording and surveillance that they cannot control.
I am not sure what you mean. I work at a large enterprise; we did not unleash it on our baseline, and it couldn't phone home, but it was really good for writing unit tests. That sped things up for us.