No. Apple makes money from subscriptions, and Apple controls the App Store with an iron fist. In a hypothetical world where the price of apps falls to ~free, Apple will just ban free apps in order to recoup that revenue. Hell, if you think that you can build apps for free, then you need to explain why Apple wouldn't do the same themselves, charge users a recurring fee for the privilege of using them, and then muscle out any competitors using their natural monopoly to reap the profits for themselves. Apple doesn't work for your benefit; like every other paperclip maximizer, they have a sociopathic focus on profit at all costs.
No, services that lean too heavily on GitHub routinely receive messages from the admins asking them to do what they can to dial back usage of server resources. It happened to Homebrew, it happened to Rust, and it happened to various other projects that thought they were being good citizens by recommending shallow clones (it turns out shallow clones are somehow harder on the server than full clones).
> Very much so once you compare it to how quickly C++ (and, in fact, any language that's ever been in the top 5 or so) achieved similar milestones.
No, this completely overestimates how quickly languages gain prominence.
C came out in 1972 and didn't gain its current dominance until approximately the release of the ANSI C spec in 1989/1990, after 17 years.
C++ came out in 1985 and didn't become the dominant language for gamedev until the late 90s (after it had its business-language-logic niche completely eaten by Java), after 14 years or so.
Python came out in 1991 and labored as an obscure Perl alternative until the mid-late 2000s, after about 16 years (we can carbon-date the moment of its popularity by looking at when https://xkcd.com/353/ was released).
Javascript came out in 1995 and was treated as a joke and/or afterthought in the broader programming discourse until Node.js came out in 2009, after 14 years.
Rust is currently 11 years old, and it's doing quite excellently for its age.
> C came out in 1972 and didn't gain its current dominance until approximately the release of the ANSI C spec in 1989/1990
While it kept growing in popularity later, by 1983-85 C was already one of the top programming languages in the world.
> C++ came out in 1985 and didn't become the dominant language for gamedev until the late 90s
Major parts of Windows and Office were being written in C++ in the early-mid 90s, before C++ turned 10. Visual C++, one of Microsoft's flagship development products, came out in 1993. Huge mission-critical, long-term, industrial and defence projects were being written in C++ during or before 1995 (I was working on such a project).
> Python came out in 1991 and labored as an obscure Perl alternative until the mid-late 2000s
Even in 2002 Python was widespread as a scripting language. But it is, indeed, the best and possibly only example of a late bloomer language.
> Javascript came out in 1995 and was treated as a joke and/or afterthought in the broader programming discourse until Node.js came out in 2009
AJAX (popularised by Gmail) pretty much revolutionised the web in 2004. When jQuery came out in 2006, JS was all over the place.
> Major parts of Windows and Office were being written in C++ in the early-mid 90s, before C++ turned 10.
Major parts of Windows, Android, and Linux were being written in Rust before it turned 10. Major parts of AWS were being written in Rust before it turned 4. Major parts of Dropbox were being written in Rust before it turned 1. So you agree by your own criteria that Rust is a major language?
> While it kept growing in popularity later, by 1983-85 C was already one of the top programming languages in the world.
In the mid-80s, C still had plenty of major and healthy competitors, as pjmlp will imminently arrive to remind you. By the criteria of mid-80s C, Rust is already one of the top programming languages in the world.
> AJAX (popularised by Gmail) pretty much revolutionised the web in 2004. When jQuery came out in 2006, JS was all over the place.
No, despite the existence and availability of freestanding interpreters (e.g. Rhino), Javascript was an also-ran everywhere except the web, which is to say, nobody was choosing to use Javascript except the people forced at gunpoint to use it. There are infinitely more people choosing to use Rust at the age of 11 than were choosing to use Javascript at the age of 11, which means that, once again, by your criteria, you must consider Rust a major language. You can just admit it instead of being a tsundere.
No, this is silly. You can look at the availability and maturity of toolchains, the rate of release of projects written in that language, the rate of release of books and learning materials, the rate at which universities begin teaching the language, the volume of discourse devoted to that language in magazines and the online venues which did exist (e.g. Usenet), and crucially the declining metrics of all of the above for the direct competitors of that language.
> The resulting constraint is roughly similar to that of never ever breaking ABI in C++.
No, not even remotely. ABI-stability in C++ means that C++ is stuck with suboptimal implementations of stdlib functions, whereas Rust only stabilizes the exposed interface without stabilizing implementation details.
> Unfortunately editions don't allow breaking changes in the standard library
Surprisingly, this isn't true in practice either. The only thing Rust needs to guarantee here is that once a specific symbol is exported from the stdlib, that symbol stays exported forever. But this still gives an immense amount of flexibility. For example, a new edition could "remove" a deprecated function by completely disallowing any use of it in that edition, while still allowing code on an older edition to access it. Likewise, it's possible to "swap out" a deprecated item for a replacement: move the deprecated item to a new path and turn its original path into an alias, then have the new edition resolve that original path to the replacement item instead, while the old item stays reachable at its new home (people are exploring this possibility for making non-poisoning mutexes the default in the next edition).
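To make the aliasing trick concrete, here is a rough sketch of what it looks like at the library level. The module and type names are made up, and the part where a new edition silently changes which item the old path resolves to would need compiler support that doesn't exist today; this only shows how an old path can stay exported forever as a deprecated alias.

    // Hypothetical "new home" for the item, e.g. a non-poisoning mutex.
    pub mod sync_v2 {
        pub struct Mutex<T> {
            inner: std::sync::Mutex<T>, // stand-in implementation detail
        }
    }

    // The original path keeps compiling on old editions, just with a
    // deprecation warning pointing at the replacement.
    pub mod sync {
        #[deprecated(note = "use crate::sync_v2::Mutex instead")]
        pub type Mutex<T> = crate::sync_v2::Mutex<T>;
    }

Old code keeps resolving sync::Mutex exactly as before; the hypothetical edition switch would simply change what that alias points to for crates on the new edition.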
Only because Rust is a source-only language for distribution.
One business domain that Rust currently doesn't have an answer for is selling commercial SDKs as binary libraries, and those are exactly the customers who get pissed off when C and C++ compilers break ABIs.
Microsoft mentions this among the adoption issues they are having with Rust (see talks from Victor Ciura). While they can work around it with DLLs and COM/WinRT, it isn't optimal; after all, Rust's safety gets reduced to the OS ABI at the DLL and COM boundary.
I'm not expecting to convince you of this position, but I find it to be a feature, not a bug, that Rust is inherently hostile to companies whose business models rely on tossing closed-source proprietary blobs over the wall. I'm fairly certain that Andrew Kelley would say the same thing about Zig. Give me the source or GTFO.
In the end it is a matter of which industries the Rust community sees as relevant for gaining adoption, and in which ones the community is happy for Rust to never take off.
Do you know one industry that likes very much tossing closed-source proprietary blobs over the wall?
Game studios, and everyone that works in the games industry providing tooling for AAA studios.
> Game studios, and everyone that works in the games industry providing tooling for AAA studios.
You know what else is common in the games industry? C# and NDAs.
C# means that game development is no longer a C/C++ monoculture, and if someone can make their engine or middleware usable with C# through an API shim, Native AOT, or some other integration, there are similar paths forward for using Rust, Zig, or whatever else.
NDAs mean that making source available isn't as much of a concern. Quite a bit of the modern game development stack is actually source-available, especially when you're talking about game engines.
Rust allows binary libraries with a C ABI. Having safety within any given module is still a big deal, and it's hard to guarantee safety across dynamic modules when the code that's actually loaded can be overridden by a separately-built version.
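For what it's worth, the C-ABI route is mundane in practice. Here's a minimal sketch of the kind of entry point an SDK vendor might export from a Rust cdylib (the function name and checksum logic are made up for illustration; Cargo.toml would set crate-type = ["cdylib"]):

    // Exported with an unmangled name so C, C++, or C# (via P/Invoke)
    // callers can link against the compiled artifact directly.
    #[no_mangle]
    pub extern "C" fn sdk_checksum(data: *const u8, len: usize) -> u64 {
        if data.is_null() {
            return 0;
        }
        // SAFETY: the caller promises `data` points to `len` readable bytes.
        let bytes = unsafe { std::slice::from_raw_parts(data, len) };
        bytes
            .iter()
            .fold(0u64, |acc, &b| acc.wrapping_mul(31).wrapping_add(u64::from(b)))
    }

Everything behind that boundary keeps Rust's guarantees; the unsafety is concentrated at the FFI edge, which is exactly the point about safety within a module.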
Who is "they"? The employees at Apple when the HIG was first published in 1986, 40 years ago? That Apple is dead, what you see before you is an empty and rotted husk.
When I began at Apple in 1995, we followed "Tog on Interface" to the letter. It was not uncommon to get into arguments over lunch about what the Right way was.
I watched as Steve Jobs came back to Apple—he really took hold of the reins of UX (aided by his team of designers).
Personally (and I say this knowing it is often a matter of taste), I didn't care for a lot of it.
A simple example: the URL field of Safari should have been, to my Tog sensibilities, an editable text field only. Perhaps somewhere (below? to the right?) you might include a progress bar to indicate the page loading. But a designer (I will not name, ha ha) came up with a combined textfield/progress bar. It looked to my eye as though, as the page loaded, the text was being selected!
Jobs loved it though.
It was then, I think, that Apple departed from "Tog" in favor of these "one-off" UX experiments.
I have rationalized this move away from a standard: since, with the advent of the web, the customer is now being bombarded with all manner of UX, they ought to be comfortable with one-off UX.
(Thankfully I see that now we have a thin line that seems to grow along the lower edge of the URL field.)
Almost by definition, you have ABI stability as long as all artifacts are compiled with the same version of the same toolchain on the same platform with all the same compiler flags. That's not enough to let you generally ship pre-compiled artifacts with the intent for users to dynamically link them, but it might be enough to let you leverage dynamic linking for hot-reloading within a single developer's workflow.
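As a sketch of that hot-reloading case, something like the following works with the libloading crate; the plugin path and symbol name are made up, and it assumes the host and plugin were built by the exact same toolchain with the same flags, per the caveat above.

    // Host side: reload the freshly rebuilt plugin and call into it.
    // The plugin crate (crate-type = ["dylib"]) would export something like:
    //     #[no_mangle]
    //     pub fn plugin_tick(frame: u64) -> u64 { frame + 1 }
    use libloading::{Library, Symbol};

    fn main() -> Result<(), Box<dyn std::error::Error>> {
        // Re-open the artifact each time you want to pick up new code.
        let lib = unsafe { Library::new("target/debug/libplugin.so")? };
        let tick: Symbol<fn(u64) -> u64> = unsafe { lib.get(b"plugin_tick")? };
        println!("tick(42) -> {}", tick(42));
        Ok(())
    }

Dropping the Library unloads the old code, so a watch-rebuild-reload loop gets you most of the ergonomics without ever promising a stable ABI to anyone outside your own build.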
The average total cost of car ownership in the US in 2025 was about $12,000. $9,000 is already a huge underestimate of what the average person is paying.
I lived in a blizzard-ridden area using just a 250cc motorcycle, year round, including riding it on the interstate. Layer up enough, use heated gloves, etc., and you can easily get by with just a Ninja 250; you're not going to burn more than $3-4k a year on that no matter how hard you try.
You don't actually need a car unless you have a child, or are a tradesman with tools, or something like that; a small-displacement motorcycle will still take you to 99.9% of the jobs in the lower 48.
I'm not sure if I'd call the above comment cable-news-brained, but it's entirely possible to push a misleading or outright false narrative while only presenting factually true statements. Remember, nearly everyone who's ever died has had a history of exposure to dihydrogen monoxide.
Not only that: it's impossible to report on everything that happened, so any outlet only reports on the important stuff. What is and what isn't considered important is a matter of bias, too.
All true. But you can choose what and how you report in order to give one side a boost, or you can choose what and how to report in order to give the best, most accurate picture you can of what's actually going on. The difference matters.
Yeah, nobody ever does it perfectly. But trying to do it right rather than trying to do it wrong surely means that you'll come closer to doing it right.
Well, at least thus far, the only way AI has made my life worse is all the people who won't stop talking about how amazing it is for vibe-coding everything from scratch, despite ample empirical evidence to the contrary.
Until and unless there are some more significant improvements in how it works with regard to creating code, having strong "manual" programming skills is still paramount.
>Are the people leveraging LLMs making more money while working the same number of hours?
Nobody is getting a raise for using AI. So no.
>Are the people leveraging LLMs working fewer hours while making the same amount of money?
Early adopters, maybe, as they offload some work to agents. As AI becomes commoditized and turns into the baseline, that will invert, especially as companies shed people and expect the remaining staff to "multiply" their output with AI.
Well they don't call it being a wage slave for nothing. You aren't getting a raise because you're still selling the same 40-60 hours of your time. If the business is getting productivity wins they'll buy less time via layoffs.
(USSR National Anthem plays) But if you owned the means of production and kept the fruits of your labor, say as a founder or as a sole proprietor side hustle, then it's possible those productivity gains do translate into real time gains on your part.
>But if you owned the means of production and kept the fruits of your labor, say as a founder or as a sole proprietor side hustle, then it's possible those productivity gains do translate into real time gains on your part.
Not even then, since it will commodify your field and make any rando able to replicate it.
The very reason why we object to state ownership, that it puts a stop to individual initiative and to the healthy development of personal responsibility, is the reason why we object to an unsupervised, unchecked monopolistic control in private hands. We urge control and supervision by the nation as an antidote to the movement for state socialism. Those who advocate total lack of regulation, those who advocate lawlessness in the business world, themselves give the strongest impulse to what I believe would be the deadening movement toward unadulterated state socialism.
At some FAANG companies, using AI is now part of the role profile against which your performance and compensation is assessed. So, yes, some engineers are technically getting a raise for using AI.
Did high-level languages and compilers make life better for working programmers? Is it even a meaningful question to ask? Like what would we change depending on the outcome?
Lots of people who wouldn't have had a job before have jobs today thanks to high-level languages; they don't need to know how to manage memory manually.
Maybe that will happen for LLM programming as well, but I haven't seen many "vibe coder wanted" job ads yet that don't also require regular coding skills. So today LLM coding is just a supplementary skill, not a primary one, which makes it unlike higher-level languages, since those let you skip a ton of steps.
Of course not. In the world of capitalism and employment, money earned is not a function of productivity; it is a function of competency. It is all relative.
Oh you sweet summer child. Under capitalism money is a function of how low you can pay your fungible organic units before they look for other opportunities or worse, unionize (but that can be dealt with relatively easily nowadays). Except for a few exceptional locations and occupations, the scale is tilted waaay against the individual, especially in the land of the free (see H-1B visas, medical debt and workers on food stamps).
(See also the record profits of big companies since Covid.)
> how low you can pay your fungible organic units before they look for other opportunities or worse
This is what I meant? The more replaceable you are, the lower you can be paid before you look for other opportunities. And, of course, yes, it is absolutely tilted against the individual.
> Are the people leveraging LLMs making more money while working the same number of hours?
> Are the people leveraging LLMs working fewer hours while making the same amount of money?
Yes, absolutely. Mostly because being able to leverage LLMs effectively (which is not "vibe coding" and requires both knowing what you're doing and having at least some hunch of how the LLM is going to model your problem, whether it's been given the right data, directed properly, etc.) is a rare skill.