The Indian-born textbook author mentioned (Malkiat Singh [0]) had an inordinate influence on many Kenyan students because his textbooks were the de-facto standard for years. It's interesting how that influence extends further as his former students go on to curate the LLMs on which the world has come to rely.
Having spent enough time with marketing and PR folks, I really wouldn't be surprised if this supposed backlash is overhyped as a way to get more people interested in seeing the ad.
Outrage and clickbait take more than one form, and they work surprisingly well on the masses; it's part of the orange man's success story. Just look at us discussing it, which wouldn't happen with a (much more costly) normal MCD ad.
TLDR
Unfortunately, medicine took a very long time to realize that vitamin D is not simply a vitamin that prevents rickets.
We know today that vitamin D is a powerful nuclear receptor-activating hormone of critical importance, especially to the immune system.
With the available data mentioned above, the proposed doses would probably suffice to maintain vitamin D levels in or above the 75-100 nmol/L range, with practically zero risk of toxicity.
For code generation, especially on larger projects, these models aren't as good as the cutting-edge foundation models. For summarizing local git repos/libraries, generating documentation, and simple offline command-line tool use, they do a good job.
I find these communities quite vibrant and helpful too:
Since you are on a Mac, if you need some kind of code-execution sandbox, check out Coderunner [1], which is based on Apple's container framework and provides a way to execute any LLM-generated code without risking arbitrary code execution on your machine.
I have recently added Claude skills to it, so all the Claude skills can be executed locally on your Mac too.
The Qwen3-coder model you use is pretty good. You can enable the LM Studio API, install the qwen CLI, and point it at the API endpoint. This basically gives you functionality similar to Claude Code.
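Since LM Studio's local server speaks the OpenAI API, anything that can talk to an OpenAI-compatible endpoint (the qwen CLI included) can use it. A minimal sketch in Python, assuming the server is running on its default port 1234 and that a qwen3-coder model is loaded; the model identifier below is hypothetical, use whatever name LM Studio shows for your download:

    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:1234/v1",  # LM Studio's default local server address
        api_key="lm-studio",                  # any non-empty string; the local server doesn't check it
    )

    response = client.chat.completions.create(
        model="qwen3-coder-30b",  # hypothetical identifier; use the name LM Studio lists
        messages=[
            {"role": "system", "content": "You are a concise coding assistant."},
            {"role": "user", "content": "Write a git alias that shows the last 10 commits as a one-line graph."},
        ],
        temperature=0.2,
    )

    print(response.choices[0].message.content)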
I agree that the code quality is not on par with gpt5-codex and Claude. I also haven't tried z.ai's models locally yet; a Mac with that much memory should be able to run GLM 4.5 Air.
For README generation I like gemma3-27b-it-qat and gpt-oss-120b.
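Those work well pointed at the same local endpoint. A rough sketch of that kind of offline README pass, again assuming LM Studio's default port and a hypothetical model identifier:

    from pathlib import Path
    from openai import OpenAI

    client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

    # Give the model a shallow view of the repo: file paths only, capped to keep
    # the prompt small enough for a local context window.
    repo = Path(".")
    files = sorted(str(p) for p in repo.rglob("*.py") if ".venv" not in p.parts)[:50]

    prompt = (
        "Draft a README.md for this repository based on its source layout.\n"
        "Files:\n" + "\n".join(files)
    )

    response = client.chat.completions.create(
        model="gemma-3-27b-it-qat",  # hypothetical identifier; use the name LM Studio lists
        messages=[{"role": "user", "content": prompt}],
    )

    Path("README.draft.md").write_text(response.choices[0].message.content)
    print("Wrote README.draft.md")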
More important than gaining a customer for Nvidia's AI chips, this investment gives the company a solid foothold in a competitor to Broadcom in the wireless, datacenter, and networking solutions space. I wouldn't be surprised if Nvidia eventually scoops up all of Nokia.