I agree with you that it's great to have a comprehensive stdlib but I think the process should be careful and gradual. Python's stdlib has a lot of stuff that is not really best-in-class or consistent in design. I think with today's ubiquity of package managers, we could get away with a smaller stdlib initially, and take time to validate libraries with real world usage before blessing them with official status.
Most languages would probably benefit from having an “experimental for the standard lib” package that doesn’t guarantee perfection, and might get deprecated and then abandoned before making it to the stdlib. But all new features go through that library first.
That experimental library could even be backed by other packages, so you could iterate on the interface while allowing those packages to continue improving the core. But the assumption is that for any clearly solved problem, these “batteries” provide one recommended way to solve it, and, as of now, it’s either the only way or at least not the worst way to do it.
Claiming that a language user should be prepared for certain parts of the language to disappear is a perfectly reasonable idea. But in practice it does not work: either a feature can be used or it can't. If it can't, it's a waste of time; if it can, it must be supported.
Once people use a library, there will be pressure to support it forever, and the mindshare switch is a real cost. A language bleeds users every time it drops a feature. The Python namespace is still polluted by documentation and tutorials explaining how to do things in Python 2.
Rust seems to handle it ok. Features can spend a long time being nightly-only gated behind feature flags, while nevertheless seeing real world usage from people willing to brave the risk and churn.
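The opt-in mechanism looks roughly like this — a minimal sketch, where the `experimental` flag and the function names are hypothetical stand-ins for a gated API (real nightly gates use `#![feature(...)]` and require a nightly compiler; this sketch uses a `cfg` flag so it compiles on stable):

```rust
// Hypothetical experimental API: only compiled in when the user
// explicitly opts in (e.g. via --cfg 'feature="experimental"').
// Until stabilized, it can change or vanish between releases.
#[cfg(feature = "experimental")]
pub fn fancy_sort(v: &mut Vec<i32>) {
    v.sort_unstable();
}

// The stable surface is always available and never breaks.
pub fn stable_sort(v: &mut Vec<i32>) {
    v.sort();
}

fn main() {
    let mut v = vec![3, 1, 2];
    stable_sort(&mut v);
    println!("{:?}", v);
}
```

The point of the design: experimental users get early access and accept churn, while everyone else sees only the stable surface.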
But Python has a very different release schedule. Rust ships a nightly build every day and a stable release every six weeks, while Python releases only once a year. I don't think it could adopt the same model.
>Claiming that a language user should be prepared for certain parts of the language to disappear is a perfectly reasonable idea. But in practice it does not work
Well, it can. That's what you get when there's no official package and you depend on third-party packages that change all the time or get abandoned (common in the Node ecosystem, in GTK+ land, etc.).
Compared to that, a scheduled, anticipated process for including candidate packages in the language (and eventually removing some) is much easier to live with.
Indeed, but per your own original comment, we have hindsight. So we can build an excellent stdlib from the get-go.
There are other bundles that are very popular for a reason. Conda. That's all you need to know. People want stability of dependencies, not a Wild West of pulling libs from random GitHub repos.
IMO the bundling and curation is the least interesting part of Conda. What's really nice are the build infrastructure, quality checks and proper dependency bounds (looking at you, PyPI). Something like R's CRAN already has those without bundling.
And now we have Mamba, which is just like Conda, except with a better-documented (and more open) tooling ecosystem, and without Conda's debilitatingly slow and unreliable dependency resolution.