Rust in 2022 (ncameron.org)
125 points by pjmlp on Jan 12, 2022 | 248 comments


I've always wanted to learn Rust but have found it devilishly complex compared to the likes of Go, but I appreciate there's not much that can be done about its complexity this far down the road.

My other big beef with Rust is its lack of a comprehensive stdlib, unlike Go. Having to pull in tons of external crates of varying quality and with varying degrees of dependency chains is not particularly pleasant.

Having a decent, well maintained, stdlib that contains "the basics" that one would expect in any 21st century programming language is sorely missing from Rust, and from my personal perspective is something that would draw me back to Rust for another go at learning it.


Actually, the "batteries included" school of library design was a 20th century idea; the trend in the 21st century in all other ecosystems very much goes in the opposite direction: fewer features in the stdlib to allow faster iteration, and strong package managers that make it seamless to mix and match packages.

Go just dodges that trend and does the 20th century thing, as it does for many other design decisions. Just as it has common data structures as hardcoded language features, instead of the ability to build them as abstractions.

To be clear: One can take the position that the trends go in the wrong direction and Go is right to reverse them -- that's a matter of opinion. I just don't think it's fair to argue Go is the language that does the more modern thing.


Hard disagree.

The main reason I've written so much Python in the last 10 years is that its stdlib is comprehensive enough to do 95% of the tasks I need without adding any dependency.

This despite the fact that I never was super-fond of python's syntax and ideology, although I just learned to accept it. I was writing a ton of perl before that, and with python I could do more with less "CPAN". I would have _never_ made the switch without.

A good stdlib ensures decent performance, stability and overall code quality for your project. Without that, you need tons of dependencies along with version pinning (as developers nowadays understand semver even less than 20 years ago) and duplication due to your dependencies picking different packages as their own base because of the lacking stdlib.

I hate the sort of ecosystem which is built around this pattern. See npm for one extreme, although I see cargo not too far away on the horizon.


Ok but you didn't contest anything in their argument as far as I can tell.


I think this mindset is exactly what Rust's stdlib embodies; where the differentiator comes in is "scope."

As an example, most of std (the stuff outside core) makes the assumption that an infallible allocator exists. This is an entirely reasonable assumption for any high level programming language... yet it's one of the biggest complaints about Rust's standard library from systems programmers. And so a not-insignificant amount of (mostly volunteer) developer time has been spent modifying very basic data structures and APIs to support the (not so niche) use case of fallible and pluggable allocators.
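A small sketch of what "fallible allocation" looks like in practice: `Vec::try_reserve` (stable since Rust 1.57) is one of the APIs retrofitted to surface allocation failure as a value rather than an abort.

```rust
// Sketch: fallible vs. infallible allocation in Rust's stdlib.
fn main() {
    let mut buf: Vec<u8> = Vec::new();

    // Infallible path: push() assumes allocation always succeeds;
    // if it doesn't, the process aborts.
    buf.push(1);

    // Fallible path: try_reserve() reports allocation failure as an
    // error, which systems code can handle instead of crashing.
    match buf.try_reserve(1024) {
        Ok(()) => println!("reserved, capacity is now {}", buf.capacity()),
        Err(e) => eprintln!("allocation failed: {e}"),
    }
}
```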

If you're working on a stdlib where the question, "what if my allocator doesn't work" is meaningful, it's not a big jump to much more complex questions about how to implement "standard" features in other languages that simply don't have the same concerns as a systems language like Rust - making it much more reasonable to shove off the responsibility of implementing those features in libraries out in the ecosystem.

And I think that's wholly in line with the space that Rust occupies, which is next to C++ (which also has a lacking stdlib for "standard" features), but at least has a good enough package manager to deal with it.


> A good stdlib ensures decent performance, stability and overall code quality to your project.

How so? I can easily give counterexamples:

An extensive stdlib ensures decent performance: https://stackoverflow.com/questions/31548680/why-is-python-3...

An extensive stdlib ensures stability: I'm not even sure how; with good version pinning it should always work, regardless of it being stdlib or otherwise.

An extensive stdlib ensures overall code quality for your project: this one I agree with most, but because the stdlib sets the idiomatic writing style of the project. I'm not sure how important the size of the stdlib is for this to work.


> An extensive stdlib ensures decent performance: https://stackoverflow.com/questions/31548680/why-is-python-3...

This example is about a language issue, not the stdlib. And it's only slower on purely computational code, which Python is known to be bad at anyway, as a tradeoff to avoid overflow issues.

> An extensive stdlib ensures stability: i'm not even sure how, with good version pinning it should always work, regardless of the it being std lib or otherwise.

Version pinning is a bad practice because it slows deployment of security updates. It also only works for end applications, as libraries cannot reasonably pin their own dependencies.


In Rust an int is an int; in Python this changes with versions, and the stdlib needs to change along with it.

> Version pinning is a bad practice because it slows deployment of security updates.

How is that not true for stdlibs?

> It also only works for end applications, as libraries cannot reasonably pin their own dependencies.

Really?


> In rust an int is an int

And Rust ints can overflow. It's a tradeoff.

>> Version pinning is a bad practice because it slows deployment of security updates.

> How is that not true for stdlibs?

Because, without pinning, the package manager always gets the latest available version and installs it for all applications on the system once and for all. When pinning dependencies, system administrators need to wait for every single application to update its pin.


> And Rust ints can overflow.

There are provisions to do math that won't overflow or that can saturate. For example https://doc.rust-lang.org/std/primitive.u32.html#method.chec...
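To illustrate the linked API: the integer types carry a family of overflow-aware methods alongside plain arithmetic.

```rust
// Sketch of Rust's overflow-aware integer arithmetic.
fn main() {
    let max = u32::MAX;

    // checked_add returns None on overflow instead of wrapping or panicking.
    assert_eq!(max.checked_add(1), None);
    assert_eq!(2u32.checked_add(3), Some(5));

    // saturating_add clamps at the type's bounds.
    assert_eq!(max.saturating_add(1), u32::MAX);

    // wrapping_add wraps around explicitly and deliberately.
    assert_eq!(max.wrapping_add(1), 0);
}
```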


Sure, but it's less convenient to use.


Do we want to always get the latest version? The whole colors.js and faker.js thing from yesterday comes to mind as a very good reason to not want to auto-update to the latest version of a lib.


That's why I install from curated package repositories (eg. Debian) as much as possible, and encourage users of my own software to do so as well.


>> It also only works for end applications, as libraries cannot reasonably pin their own dependencies.

> Really?

At least not usefully if types from those dependencies appear on the API surface.


Just try to do anything that involves any kind of heavy calculation with Python and tell me it has "decent performance". If you write such code in pure python it can easily be up to 1000x slower than a compiled language or using dedicated libraries that link C/C++/Rust libraries to Python. This is why you need to immediately reach for numpy and similar libraries that aren't part of the standard library.


Software does not always require "heavy calculations"; that is the case for the majority of applications today. You would be surprised at what can be made in Python using the stdlib only.

Ironically, Python makes the best desktop calculator. Go figure.


This discussion is about the standard library, not the language runtime. Decent performance here is meant relative to the performance of the language. In this case it could mean algorithmic complexity, for example.


Go has a much easier time of it for including some batteries, because it’s a simpler language.

Take HTTP, for example: the decision of the data type to use for storing headers is easy, because there are really only four possibilities, two binary decisions to make: ① do you use string or []byte; and ② do you use map[T][]T, or join values with commas and special-case that pesky Set-Cookie header somewhere and use map[T]T? And so map[string][]string was fairly obviously the best choice from the start, and I think still is—in Go.

Meanwhile, in Rust, are you going to go Vec<(Vec<u8>, Vec<u8>)>, can you get away with HashMap<String, Vec<String>>, do you store a central buffer and work with slices for fewer allocations, or do you say “no, that’s stupid, headers aren’t strings, they’re just serialised as strings” and go all fancy typey in one of a variety of ways, some the most efficient of which depended on GATs which are only recently stable?

Go could have net/http from the start. I don’t know if there’s something better now (I don’t use Go), but I expect that the better would still be very similar in API.

If Rust had had a std::net::http at 1.0, it would certainly have been deprecated by now. Probably even over four years ago.

(Disclosure: I wrote the first de facto standard HTTP library in Rust, rust-http, back in 2013. Then in 2014, realising its design was a dead end, I went to write a new one from scratch, Teepee, but never finished it because of decision paralysis, others took up much of my work and finished the concept off in Hyper, and I’m glad I got out of the critical path and I’m glad they took it up.)


Then you go "Can you multiply a duration with a number? No, that would be stupid. Can you multiply a duration with a duration? Heck yeah!" Such an intelligent type system.


Hmm? std::time::Duration doesn’t implement Mul<Duration>, and does implement Mul<u32>, and has methods mul_f32 and mul_f64. So you can multiply a duration with a number; and you can’t multiply a duration with a duration because that would be stupid.
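A quick sketch of the `Duration` API described above:

```rust
use std::time::Duration;

fn main() {
    let d = Duration::from_millis(500);

    // Duration implements Mul<u32>, so duration * number works...
    assert_eq!(d * 3, Duration::from_millis(1500));

    // ...and mul_f64 covers fractional scaling.
    assert_eq!(d.mul_f64(2.5), Duration::from_millis(1250));

    // `d * d` does not compile: Duration has no Mul<Duration> impl.
}
```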


I was talking about Go.


Oh wow, that is nasty.


I think it's worth expanding this a little more.

The problem with standard libraries in general is that it's not really possible to adjust the API later if it turns out to not be well-designed. And it turns out that "oops, we got the API wrong" afflicts, well, literally every standard library I've ever worked with (note: this even does include Rust, as young as it is). Examples include:

* C/C++: wide strings, locale support, std::unordered_map (it prevents you from using a good hashtable implementation)

* Python: mail parsing libraries, http requests [and this even after having an exceptionally painful major version that allowed them to break compatibility!]

* Java: java.util.Date, java.util.Vector [as opposed to java.util.ArrayList]

* Rust: raw FD support, std::io::Read interactions with partially-uninitialized buffers

There is something you notice when you look at standard library misfeatures, which is there are different kinds of issues. The simplest issue is something like locale support in C/C++: you can just simply not use it, and everything will be fine. On the other hand--and this afflicts Python the most of any language I'm familiar with--having a broken implementation in stdlib can dissuade the community from making a good version of what you want (this is what happens with the mail format stuff). The worst possible kind, though, is when you make a core type with an utterly broken API--and this is what java.util.Date qualifies as.

Having "batteries included" makes it more likely you get more severe kinds of API failures in the standard library. Already, the Rust ecosystem has had evolution in core package libraries that allowed it to absorb poor API design--consider the shift from error-chain to thiserror/anyhow, or the futures 0.1->0.3 change.


But the comparison here is between the organization of the Rust library ecosystem and Go's; the Go standard library has, I think, more (and more ambitious) strongly-accepted abstractions --- it's uncontroversial to observe that Rust programs pull in more deps than Go packages pull in modules. But the phenomenon you refer to in your last 2 paragraphs hasn't really occurred with Go; it's hard to point to something in the stdlib that Go just straight-up botched, to the detriment of all other packages.

Go programs pull in deps, of course, but it's unusual bordering on unidiomatic to pull in a dep that changes the general programming environment (idiomatic Go sticks with the data structures and, notably, concurrency primitives in the stdlib, net/http remains the de facto standard HTTP interface, things tend to build on top of database/sql instead of replacing it, for the most part the stuff in encoding/* is canon, &c). That's not a plus for Go, but it's a contrast with Rust, where crates routinely redefine or refine the underlying Rust programming environment (best example is anyhow, of course). So Go programs rely heavily on the stdlib and its abstractions. And my perception is that by and large Go devs are pretty happy with it?

Maybe DNS would qualify as such a botch; maybe the interaction between sockets and contexts as well. And we'll see what chaos erupts from 1.18 generics.

(None of this should be construed as in any way a dig on Rust, which I find frustrating but also enjoyable to work in, and whose expressiveness and utility advantages over Go I respect).


I think both you and the parent are correct. I don't think there is one answer to this question for both languages. I do think the right answer for Rust is to "wait and see." Let libraries organically develop, and wait to pull them in at the right time, to, for lack of a better word, die.


As a counterpoint, we can look at the npm ecosystem for the disadvantages of a "just use small packages from the ecosystem" approach. Look how many dependencies a typical npm package has, how many different authors there are, and what happens when one of them goes off the deep end?


"All other ecosystems"? Not sure about that - besides Python, which another commenter mentioned, there is also PHP, which is pretty much 21st century and also "batteries included". I totally get it that small teams developing a language (Node, Rust) need to focus their efforts, but please don't try to spin this limitation into a "new way of doing things". What I hate most about this "package manager"-based way is not only having to choose between a multitude of options, each with its own pros and cons, for basic functionality, but also the burden of keeping everything safe and up to date (while making sure it's still working), and the constant threat of having to switch to another package/crate if the one you have decided to use is abandoned.


Talking about centuries in programming is pretty silly. A large number of popular languages (Java, C#, Python, PHP) of a certain age went down the batteries included route. All of them have "dead batteries" in their standard libraries that can't be removed because of backwards compatibility.

At this point more of Java's standard library shouldn't be used than should. Java literally has multiple UI implementations in the standard lib and none of them are the recommended method of UI development in Java.

As far as I am concerned the only thing worse than having to use a package manager to make choices, is having to use the package manager to make choices to replace the dead ends codified in the standard library.


The batteries included aspect of PHP is very 1990s. It's the thing I always disliked the most about PHP in that so much of the functionality required PHP to be built with all kinds of stuff compiled in and in shared hosting environments you would be at the mercy of however the admins built their PHP (not to mention having to deal with the overhead of having loads of code that you might not want in your system). Maybe it's gotten better since I was writing PHP code 14 years ago, but ugh.


<non-judgmental> PHP and Python were both released in the 90s


I don’t think this is right? C, C++, and JavaScript certainly had very spartan standard libraries. Python and Java certainly emerged in the mid 90s (within striking distance of the 21st century), but I’m not sure the state of their standard libraries at the time. I suspect other languages also had smaller libraries but can’t prove it.

With respect to mainstream 21st century programming languages, I’m only familiar with Go, Rust, and C#, of which C# and Go certainly have pretty comprehensive standard libraries. Other mainstream 21st century languages I can think of run atop either JVM or .Net and I believe they have access to the Java or C# standard libraries, respectively but I’m not sure if it would be fair to say they inherit those standard libraries or not.


I'm not sure those are counterexamples.

The discussion is about large standard libraries vs. package managers. C and C++ are even older languages that had neither. Back then, the idea of having a large abstraction layer between the language and the OS wasn't even thinkable -- that idea didn't really appear before the 90s.

If you don't know the size of Java's standard library in the 90s, it's maybe risky to imply that it must have been small. :) Kidding a bit. But to give a hint, it included two different incompatible GUI frameworks by 1998.

JavaScript is from the 90s, but wasn't a standalone programming language until Node.js, which was released in 2009.

C# is the newest non-Go language you mention with a large standard library. It was released in 2000 (so in the 20th century if we're being pedantic), and even it has moved a lot of stuff into Nuget packages that used to be built in. For example, the entirety of ASP.net.


> The discussion is about large standard libraries vs. package managers. C and C++ are even older languages that had neither. Back then, the idea of having a large abstraction layer between the language and the OS wasn't even thinkable -- that idea didn't really appear before the 90s.

Agreed, but this is my point. I don’t think the distinction between “20th century vs 21st century” is particularly meaningful, but I’m responding to the assertion that 20th century languages are characterized by large standard libraries—how can that be the case if standard libraries didn’t really exist apart from a few languages in the very final years of the 20th century?

> If you don't know the size of Java's standard library in the 90s, it's maybe risky to imply that it must have been small. :) Kidding a bit. But to give a hint, it included two different incompatible GUI frameworks by 1998.

Fair enough.

> JavaScript is from the 90s, but wasn't a standalone programming language until Node.js, which was released in 2009.

No, but it had a standard library (browser APIs) long before NodeJS.

> C# is the newest non-Go language you mention with a large standard library. It was released in 2000 (so in the 20th century if we're being pedantic), and even it has moved a lot of stuff into Nuget packages that used to be built in. For example, the entirety of ASP.net.

But enough remains in the standard library for it to be considered “large”.

Anyway, if there are only a handful of 20th century languages with large standard libraries (all clustered around the turn of the millennium) and seemingly only one mainstream language in the 21st century with a small standard library (Rust), I think the original claim that Go is the exception among 21st century languages is incorrect.


You’re bending logic to the point where I have a very hard time assuming you’re arguing in good faith.


I assure you I’m not the one bending logic. Good day.


The idea was surely thinkable, just not for C, for which UNIX basically was the standard library.

Hence why everyone kind of expects POSIX to compensate for the anemic ISO C.

While ISO C++98 was built on the same anemic principle, due to its relation to C and UNIX, all major C++ compilers had rich frameworks, which unfortunately WG21 ignored.


But C isn’t tied to POSIX any more than it is tied to the Windows API. The standards working groups weren’t stupid, they just tried to specify things in a way that wouldn’t force an inefficient UNIX emulation layer on everyone else.


Yet we ended up with POSIX everywhere there is a C compiler, with exception of hardware that is worse than a PDP-11.


Even still, posix is pretty anemic by the standards we’re talking about.


Indeed, although it doesn't change the fact that C kind of needs to bring UNIX for the ride despite its portability sales pitch.


Agreed.


Disagree.

The go stdlib has the compatibility promise. Managing ANY external dependencies is exactly what causes breaking changes to be introduced over time. Think of it as a long term support plan on all your packages.

Python had a HUGE stdlib, which was its downfall. Go's stdlib is quite small in comparison. I'd say the kitchen-sink, batteries-included approach is the 20th century idea, not the approach Go has taken.


> My other big beef with Rust is its lack of a comprehensive stdlib, unlike Go.

I think many of us are feeling these days that the Rust standard library might already be too large, and that it's better to be very conservative about adding stuff there, or to do so only after an extended period of independent vetting. In my experience, standard libraries invariably turn into very stale and outdated things relatively quickly, which carries an enormous maintenance burden.


> only after an extended period of time of independent vetting

Let's face it, what will your average dev do in the absence of something being in stdlib ?

They will (semi)randomly pick some third-party crate which in all likelihood is totally unvetted.

The crate that they selected could equally turn stale and outdated quickly if it's a small one-man-band GitHub party (a not unfamiliar story these days).

Hence, having a standardised baseline available in stdlib is no bad thing IMHO. It brings "sane defaults" and forces anyone wanting to do something different to make a thought-out decision, not just grab whatever crate "looks good" so they can get on with coding.


I disagree here.

Coming from Ruby, there are lessons to be learned. Ruby has a large stdlib, like you seem to want. But a lot of stuff in there is outdated. So much so that many, even most, people, projects, and frameworks immediately replace them with a community version. Sometimes because they are actually better maintained, more stable, faster, cleaner and more secure than the ones in stdlib. A similar thing is seen in Ruby-on-Rails, which adds features and even entire frameworks with every release, but which many people swap out for their own preferred lib almost immediately.

This is inevitable and obvious. And, IMO should be embraced: deprecate them from stdlib, and leave it to "the community".

> They will (semi)randomly pick some third-party

Again, good lessons can be learned from Ruby (gems) here. People will pick a "random" one only if there are no canonical, de-facto-standard libs (gems) out there. But for some gems there is a clear, singular go-to option and everyone will pick that, turning it into something of a stdlib. A good next step would, IMO, NOT be to then pull those libs into the stdlib, but, instead, to make the "not-fully-stdlib-but-also-not-random-community-provided" into a distinct category of libs.


Ruby standard library is such a mess. It had like two YAML serializers at some point, right? dbm and gdbm (whatever they are)? Multiple prime number generators? Rinda (what the hell is this)? RSS? Xmlrpc? So, they had time to put in all that crap while failing to design a usable HTTP client.


> Coming from Ruby, there are lessons to be learned. Ruby has a large stdlib

And is very aggressively moving stuff out of stdlib to three concentric circles of gems; “default gems” which are almost like stdlibs (and still tied somewhat to Ruby versions) but are separately maintained and can be updated between Ruby versions, “bundled” gems that are usually distributed with Ruby, and just plain regular gems.


Indeed. And this is one of those "lessons learned".


> They will (semi)randomly pick some third-party crate which in all likelihood is totally unvetted.

That's okay though. It's better for some devs to (temporarily) pick some bad crates than to permanently include some broken legacy APIs in the standard library that need to stay there due to compatibility issues. Even Python 3 couldn't get rid of a lot of code that just should not be in the standard library.


I disagree quite strongly. A bad stdlib API can be deprecated; a bad crate bitrots. You won't see the situation so clearly at first, but after a few years you will find that an increasing number of useful crates no longer compile without intervention because direct or transitive dependencies no longer exist.


Rust commits much more strongly to backwards compatibility than a language like Python. Deprecating something in the stdlib would be a big deal.

The fact is that in practice, for most “standard” functionality that isn’t in the stdlib, there are well maintained, well vetted crates written by well known authors (or even developers of the stdlib). These are not hard to find, either by asking around or looking at what other crates use.

Rust’s stdlib is consistently of high quality. I can use everything in it without worry or concern. I can’t say the same about Python.


Rust has already deprecated stuff in stdlib, though. It gets tagged as deprecated, and an alternative is generally provided, but the deprecated function sticks around so that stuff that uses it doesn't break. Seems like the best solution to me.
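This is the mechanism behind stdlib deprecation: the `#[deprecated]` attribute makes the compiler warn at call sites while keeping old code compiling. A minimal sketch (the function names and version string here are illustrative, not from any real API):

```rust
// Hypothetical old API, kept around for compatibility but flagged
// so callers get a warning pointing them at the replacement.
#[deprecated(since = "1.2.0", note = "use `new_api` instead")]
fn old_api() -> u32 {
    new_api()
}

fn new_api() -> u32 {
    42
}

fn main() {
    #[allow(deprecated)] // without this, the call below warns at compile time
    let x = old_api(); // still compiles and runs: nothing breaks
    assert_eq!(x, new_api());
}
```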


Yeah this is true, but to my knowledge that’s never been done since 1.0 with an entire module, just certain functions (could be wrong, but I have never seen it).

Keeping deprecated functions around still requires some maintenance burden, because you can’t break them just because they’re deprecated. That maintenance burden might become pretty extreme if we’re talking about something as complicated as an HTTP server module, as some folks are asking for in this thread.

I also personally don’t want to see the standard library littered with failed and deprecated experiments. In addition to the maintenance burden, it adds confusion when you’re learning the language or trying to discover how to do something.


> The crate that they selected could equally turn stale and outdated quickly if its a small one-man-band Github party (a not unfamiliar story these days).

Yes, but the crate dev likely offers no maintenance guarantees.

A bad stdlib has to be maintained forever (and potentially diverts new users from better options).


I mean, look at Java Futures and Streams API. Like, who the hell designed this crap? Didn't they have C# and Scala to rip-off? And it's stuck forever.


It doesn't have to be.

Code can be deprecated but it is often a long drawn out process.


Python has been at it for most of my career.


> They will (semi)randomly pick some third-party crate which in all likelihood is totally unvetted.

It doesn't have to be that way, though. Why can't there be first-party crates? If the Rust core team is adding functionality to stdlib why can't they add functionality to a crate instead? Then you'd have a collection of first-party, vetted crates for devs to choose from.


The difference between having something in the standard library vs shipping it as a crate is evolution, not vetting: you can have the same people develop/vet a third-party crate just as if it were part of the standard library.

The big difference is versioning (eg. you can have a crate that works on old compilers/stdlib, you can pull an older package version to new compiler/stdlib) and maintenance burden.


> They will (semi)randomly pick some third-party crate which in all likelihood is totally unvetted.

That this is the norm for a language that focuses on security is mind-boggling.

Didn't we learn anything from the NPM mess?


I really like this tweet[1], which is a bit cheeky, but also gets at the issue that everyone's "basic" needs are actually different.

[1]: https://twitter.com/steveklabnik/status/1367854800193413126


To be fair, C++ has std::map and iostream/printf, so two of the examples given here as "in Rust's standard library and not in C++'s" are wrong.

Now there are reasons many people choose not to use the built-in C++ std::map, and very nearly this ended up the case for Rust's HashMap, until they switched the implementation to the crate that was becoming popular. But even there, there are design decisions, like Rust using more compute to have a strong hash to avoid DoS attacks, that some may feel are inappropriate for their use cases.


You probably mean std::unordered_map, not std::map. std::map is a red-black tree, so it's akin to TreeMap in Java or BTreeMap in Rust. (It really is a shame that the smallest name for an associative array in C++ is for the version that is the least useful).


I wish there was an `Either` type in std. I realize that there used to be one and we have `Result` now. However, now that we have `impl Trait` it's worth revisiting, I believe. If we don't we'll have the one from `itertools`, the one from `futures` etc.

In most GCed languages it wouldn't matter so much because you'd return a boxed `Iterator` or `Future` or what have you. But in Rust you generally want to avoid the allocation.


IMHO most of the time it's better to create a special 2-case enum for your specific use case. It's 3 lines of code, which gives you significantly clearer naming than "Either" and "Left/Right".


I ran into this recently and solved the problem via Enums. Not sure of the performance impact.


I think most people end up regretting code that uses itertools::Either at the moment and use it reluctantly.


Can you expand on that? As long as the public type is opaque I don't see what there is to regret.


As someone with difficulties telling left and right apart, using Either isn't particularly straightforward. Plus, the Left and Right variants become meaningless on their own.

I.e., if I wanted an enum of either a string or an integer, it becomes "Either<&str, i64>".

But what I wanted is that it's either a string identifier or the database Id of something to be later referenced, which might be better described as "enum IdentOrDbId { Ident(String), DbId(i64) }".

This is of course a much simplified example of things I've faced.


Sorry, based on the replies I'm getting I did not make myself clear. I only want to use Either in cases like the following.

    fn some_func() -> impl SomeTrait {
        if some_condition {
            Either::Left(a)
        } else {
            Either::Right(b)
        }
    }
Both a and b will implement SomeTrait which is all callers care about. However, because they're structurally different they must be wrapped in an Either that delegates all methods from SomeTrait to a or b respectively.


I don't think that would work in general: trait methods can have signatures for which you can't synthesize an implementation for the sum type (something that takes a second Self as an argument, like std::ops::Add, comes to mind).

For object-safe traits where this would be possible, you can at least do this at the cost of an allocation, as you probably know:

  fn some_func() -> Box<dyn SomeTrait> {
      if some_condition {
          Box::new(a)
      } else {
          Box::new(b)
      }
  }


The moment you have an Either::Left(Either::Left(x)) you're starting to revisit everything. Yet that's what happens with impl Iterator a lot when you have branches.


Oh, what will people do if there's no library for their little snowflake problem?

How could you ever possibly, gasp, write it yourself? Oh the horrors of computer science.

Seriously, many times I find myself needing only a fraction of some provided library.


They will be unproductive. Yeah, it's fun to micro-obsess over some hand-rolled code. Save that for side projects, or maybe you can do it if you're dev cog N+1, N > 20.


I'd be interested to know the distribution of sizes of organizations that write software. What proportion of all software development orgs are < 5 devs, < 10 devs?


> needing only a fraction of some provided library.

In a compiled language like Rust, that shouldn't matter, should it? The binary size, performance, and attack surface should not increase from code that is unused, as it won't end up in the resulting binary. Or did I understand it completely wrong?


> I have yet to not experience that standard libraries turn into very stale and outdated things relatively quickly that carries an enormous maintenance burden

Maybe. I think .NET's core libs are excellent. So are Go's. I'm sure it's a maintenance burden, but man... It's so nice to be able to build a project with maybe 1-2 dependencies instead of 100s (as in my typical Node and Ruby projects, if you count transitive dependencies). It's hard to overstate how much of a strength I think a good stdlib is.


Golang's stdlib is very nice to work with.


I don't disagree, and so was Python's standard library in 2003. The issue with standard libraries is that the moment they move past algorithms, data structures and basic IO they are often not lasting for more than 5 years.

Even the Go standard library is quickly aging as contexts were not around when it was first created. It will be interesting to see how Go's standard library is going to evolve with Generics and more reliance on contexts.


Python, the language, is easy to learn. Its stdlib is not.

I agree that we need a serious refactoring of Python's stdlib.


> serious refactoring of Python's stdlib.

My hands start to sweat at the idea of this becoming The Python 4 Debacle.


Language semantics != stdlib.


It doesn't matter why a new version would be very backwards-incompatible. Whether that is language semantics (2->3) or deprecation and replacement of stdlib (3->4 ?)


> Even the Go standard library is quickly aging

If anything, I found go stdlib to age like fine wine. It doesn't do everything under the sun, but what it does it does very well and can easily be extended if needed.


I think the argument was that Go is still too young to make that argument in the long view.


Well, the time threshold given above was 5 years. It's been 10 since 1.0.


Yes, and it makes sense for Go to have a stdlib like this, but does it make sense for Rust? You wouldn’t use Rust unless you were aiming for low level optimizations all the time, making one size fits all standard libraries less appropriate.

There is no need to make Rust like Go, Go is already a great Go.


This has nothing to do with the properties of Go/Rust as languages; it's all about organizational structure. Go is supported by Google, which can put the necessary resources into including a bunch of high-level batteries in the stdlib and keeping it coherent. The Rust community does not have such resources, and even if some big players are joining the community, they don't own Rust and won't be able to allocate resources for continuous support without a proper org structure that makes this sustainable and efficient.

Philosophical reasoning like "Rust should have small stdlib" is horseshit. Ideal language has every function perfectly designed in the stdlib (kinda like PHP in 2003, lol). But to get closer to that dream you need some sustainable organization that maintains and expands that stdlib, wishes to do so and has resources to do that continuously.

If it's just random 5 people doing maintenance and hacking at their spare time, that's not enough for an expanding library of diverse code. That could be enough only for keeping very core features working/improving like const generics and borrow checker.


> Ideal language has every function perfectly designed in the stdlib

That way of thinking only makes sense if nothing ever changes. You even bring PHP as a good example here of how this fails fast. The world has changed greatly since 2003. The Python standard library carries three XML implementations and an outdated HTML parser irrelevant for the modern world. These libraries were state of the art when they were written, but it's the world around it that changed.

Now one can make the argument: why was it not modernized and updated? The simple answer is that people started building code against this and now depend on it working this way. The only way would be to completely rewrite them with new APIs, remove stuff no longer needed but then you're effectively back to just using crates.io and the likes. Except it's worse now because the standard library is a monolith and does not compose.


On the other hand, code using those libraries still works, whereas code trying today's favorite parser will not work after the community moves on and stops maintaining it.

I agree you don't want everything in the stdlib, but having a stdlib way to achieve something is pretty valuable, even if it's not the best way and never was.


Why would code dependent on third party libraries break? It’s not like crates.io is going to stop hosting them when they become unmaintained. Barring security vulnerabilities, a library that does a well encapsulated task like parsing json might work just fine without updates for ten or more years.

In cases where libraries don’t keep up with changes in standards or whatever, yeah, you’re going to have to find another library. This is a part of our job that is pretty much unavoidable no matter the language. We should learn lessons like sticking dependencies you’re not certain about behind an interface to minimize the cost of switching if possible.
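"Sticking a dependency behind an interface" can be as simple as a small trait the rest of the app codes against; the names below are made up for illustration, and a real impl would delegate to whatever JSON crate you picked rather than format strings by hand:

```rust
// The app only ever sees this trait, so swapping JSON libraries
// later only touches the one impl behind it.
trait JsonCodec {
    fn to_json(&self, pairs: &[(&str, i64)]) -> String;
}

// A naive stand-in implementation for the sketch.
struct NaiveCodec;

impl JsonCodec for NaiveCodec {
    fn to_json(&self, pairs: &[(&str, i64)]) -> String {
        let body: Vec<String> = pairs
            .iter()
            .map(|(k, v)| format!("\"{}\":{}", k, v))
            .collect();
        format!("{{{}}}", body.join(","))
    }
}

fn main() {
    let codec: &dyn JsonCodec = &NaiveCodec;
    assert_eq!(codec.to_json(&[("a", 1), ("b", 2)]), "{\"a\":1,\"b\":2}");
}
```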


> Barring security vulnerabilities, a library that does a well encapsulated task like parsing json might work just fine without updates for ten or more years.

JSON library updates are a constant tedious chore for many in the Java world. Jackson and Gson have both had several we've needed to make over the years. Now, _most_ of these are due to non-default features where you allow it to deserialize arbitrary types that we don't use. But clients big enough to insist on a BOM and internal security teams both just see "Jackson version x.y.z has a RCE on it, you need to update", even if it's literally impossible to trigger without changing the application code to enable insecure features.


You’re right, a json parser may have been a poor choice. Perhaps something like a UUID generator, or a parser for some more well-established and stable text format would be a better example.

On the other hand, something like a json parser needing to change all the time to support the ecosystem is another good reason to keep it out of the stdlib, IMO. There are limited hands to work on Rust itself, and I’d rather they spend their time making improvements to the core of the language, stabilizing features, and so on. As a user of rust, I’m willing to pay the cost of finding a good json library if it means I don’t need to update rust versions just to deal with some issue in JSON parsing.


At the same time, code that is "one and done" like a uuid generator is a much lower maintenance burden for the standard library.


Very true! I think the philosophy of letting external crates handle it and seeing how stable they are before considering inclusion in the stdlib is a good one here. Maybe in a few years we see that the uuid crate never needs updates and is super stable, so someone opens an RFC to pull it into std.

The thing is it’s easy to be wrong if you’re trying to predict up front what is going to wind up being stable and low maintenance and what isn’t, so the precautionary nature of what gets into the stdlib I think reflects that.


Lol, I do remember how an upgrade of a JSON serializer (JSON.NET in that case) broke an application. There was a .NET service publishing events to a queue with a bunch of consumers - several Golang services and (our) Scala service. One day our service starts crashing because it can't parse the payload. We asked the developers of the upstream service if they had changed anything and they said "no, and by the way, the other consumers are doing fine".

Turns out, they upgraded JSON.NET and it started naming fields in a different case. The Golang parser didn't really care about snake_case, camelCase or PascalCase, but whatever parser we were using in Scala at the time actually did.


Two thoughts, as a very recent convert from expert-level Go to all-around-the-place-level Rust:

(For some context, interested in Rust since 2009, similar as with Go, but not involved at all, whereas contributor to Go pre-1.0.)

First thing, is that I was frustrated with lack of simplicity vs. Go for a long time as well. Finally, recently I got it: AFAIU it's a case of different priorities. In the triangle of simplicity-safety-performance, Go loves all of them, but in case of conflict, chooses simplicity (thus, most notably/infamously, we get GC and nil). Rust also loves all of them, but in case of conflict chooses safety AND performance. What's freaking crazy IMO is the "AND" here, which I see as a huge thing. And it instantly also explains to me a tension in the Rust community between safety and performance camps, infamously most visible around usage of the 'unsafe' keyword. Yet in a barely-managed marriage between safety and performance, simplicity is the heartily beloved friend, who still just gets silently shown out of the room when a serious row starts between the couple, until they're ready to open the door back, and with patient smiles listen to some friendly suggestions.

Secondly, I bumped my head into a wall a few times until recently I tried the O'Reilly "Programming Rust (2nd ed.)" on a recommendation from Reddit. The "2nd ed." part is important here, because it covers a few relatively recent important additions to the Rust ecosystem that I found I was teeth-gnashingly frustrated without before (coming from Go), namely: anyhow, thiserror, and (to a lesser extent) async. Edit: Two bonus personal recommendations for the book: (1) I got a job in Rust after going through it and then testing my abilities on a personal project, where I finally felt I was managing to do regular non-fancy-shmancy coding (i.e. web, sqlite, GUI via iced, but no macros and avoiding lifetimes whenever possible - if curious, the project is: https://github.com/akavel/backerrs), and can work around lifetime issues (somewhat faster or slower, but can, often just falling back to .clone()/Rc); though in all honesty, they kind of didn't really test my Rust skills, to my huge surprise, and trusted my smarts shown in Go & C++ experience + passion for the project; (2) I'm carrying the book with me to the workplace every day and definitely checking things in it from time to time.


I adhere very much to your first point. This made me think that from the opposite point of view the triangle becomes complex-unsafe-inefficient. And of those three "qualities" a program can have, complexity is the only one for which i don't have a tool or a methodology.


Bryan Cantrill has an interesting talk that discusses various languages in terms of their values that you might finding interesting: https://www.youtube.com/watch?v=2wZ1pCpJUIM


Had you previously read the No Starch book or did you start with the O'Reilly one?


I only tried the online "official" Rust Book before (and that one didn't work for me unfortunately), didn't try No Starch.


I learned Rust only to abandon it afterwards. It's not worth the effort, especially if you already know Ada and various memory-managed languages like Common Lisp and Go. Although I'm sure these exist, I've never written a program that would benefit from not having a garbage collector.


If your program benefits from having a garbage collector (GC) — which I guess is a lot of programs — then by all means go for a GC-ed language. It's one less thing to worry about.

However, the garbage collection process itself sometimes introduces a noticeable pause in execution, which can be unacceptable in certain applications. Especially in critical real-time applications.

(That GC pause won't be a problem if your program mostly just waits for user input the majority of the time).

For an extreme example, you don't want your car's airbag controller to be paused for garbage collecting during a crash.


I agree, of course, there are some domains where garbage collection can be problematic, most notably game engines and DSP. High integrity applications and hard real-time systems like the airbag controller you mention usually go a bit further, it is often forbidden to dynamically allocate any memory, or at very least it may only be requested once at the start.


Pause-free GCs have existed for years, for what it’s worth. They do still have some CPU overhead in many cases, but it’s a question of throughput, not latency.


They also introduce (sometimes quite large) memory overhead.


The ONLY lib missing from the stdlib is a time library.

Maybe also a unified interface to work with DBs.

Cryptographic functions, while useful, age poorly and require constant change.

Serde, while extremely useful, won't fit every use case. Sometimes you want something smaller or faster.

Http(s) clients also tend to age poorly. Python also had an http client/server in the stdlib that aged like warm milk.

For a thing to be in std, it needs to be, imo, something you don't want implemented in several different ways. Something that is essentially solved and will change very little over time.


Certainly not DB handling. Databases have vastly different feature sets and are tightly coupled to I/O abstractions, so it's not uncommon to call database-specific functions from a driver. Maybe a very stable first-party crate, but not stdlib.


Unicode handling. There are two big projects on that front in the crate space, but this is, for me at least, the biggest gap in the std lib (but then I'm writing a text processing/formatting system so it matters a lot to me).


I don't think proper Unicode handling is a thing that can be solved universally. Do you need it for formatting, to count characters, for currency extraction, time formatting, etc.


At the very least, some basic things like collation and normalization should be part of the String type. I think character counting would also be a useful thing (and given the ASCII-art formatting of error messages in Rust, it's presumably part of the compiler's code already, although there are plenty of bits of compiler code that aren't exposed through the stdlib, like the ability to take a string like \u{e0} and convert that to à).
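For what it's worth, the escape-decoding step mentioned here takes only a few lines of stable Rust; this is just a sketch of the idea, not the compiler's actual code:

```rust
// Map the hex digits inside a \u{..} escape to the char they name.
fn unescape_u(hex: &str) -> Option<char> {
    u32::from_str_radix(hex, 16).ok().and_then(char::from_u32)
}

fn main() {
    assert_eq!(unescape_u("e0"), Some('à')); // U+00E0
    assert_eq!(unescape_u("110000"), None);  // beyond the Unicode range
}
```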


And error handling, async runtime,...


I thought that was being added (error handling).

Async runtime is neat, but it tends to be more trouble than it's worth. And I don't think any design has emerged as universal way to deal with it.

Time however has/should have a single representation.


It is being added, but it's not yet there in the 2021 edition.

Without a proper async runtime it is very hard to write portable asynchronous code, as each library is tied to a specific runtime.


> Having a decent, well maintained, stdlib that contains "the basics" that one would expect in any 21st century programming language is sorely missing from Rust, and from my personal perspective is something that would draw me back to Rust for another go at learning it.

What would you like to see in that? Genuinely curious, I've considered putting something together that's just a meta-crate that pulls in some others.


I would suggest that an opinionated website documenting a curation of crate choices might be better than a crate (which would be a pain technically). I’ve been considering making one if you’d be interested in collaborating.


Happy to discuss. I'm sick so I'm being extra terrible about responding to emails (and I'm about to go to sleep at 6am -_-) but feel free to reach out. I could see a website being much simpler to maintain.


I was actually thinking as a learning exercise, to go through python docs and create a rust book covering the needs of a python developer with different crates. If the book is versioned you can easily remove and add new crates. But I've been busy at work...


I think serde should be part of the stdlib. An HTTP server and client library is also really a must-have today. Maybe some abstraction for SQL database drivers.

Some other candidates include: logging, async runtime (Tokio), random number generation, regex


Let me give you some contrasting views on this: we already know that serde has (serious) limitations, and sooner or later a completely new version will be needed. HTTP server and client libraries will not make sense unless async is also in the core. Moving async into the core is going to raise serious problems for people trying to port this to new ecosystems, and there is not even agreement on exactly how async should continue into the future.

In fact, there are people (like me) who are very happily pushing back on moving stuff into the standard library unless it's clear that the code there can live for 30 years. And I do not see this for almost all the things you just mentioned.


What are Serde's serious limitations?


You're constrained by the internal serde data model. For instance, serde has no bignum in the data model, so you can't express one, which requires awful workarounds. I went into the types of abuses necessary here a bit: https://lucumr.pocoo.org/2021/11/14/abusing-serde/


What about being part of the standard library makes a library better?


> What would you like to see in that? Genuinely curious

I see someone else has got there before me and my answer would not be dissimilar.

Stuff to be able to talk to web services (because let's face it, any modern software will be doing that) is the obvious low hanging fruit, i.e. HTTP, JSON, Crypto etc.

Beyond that (if it's not there already), basic tools to enable out-of-the-box CLI writing (e.g. flags etc.)


> Stuff to be able to talk to web services (because let's face it, any modern software will be doing that) is the obvious low hanging fruit, i.e. HTTP, JSON, Crypto etc.

The problem with adding this is that in 10-20 years it's not so useful anymore. Then you'd have:

* an outdated HTTP library (look at how "everyone" is using a 3rd party HTTP library in python)

* a JSON library no one uses (everyone used XML, now it's out of fashion. And JSON has obvious problems!)

* Old cryptographic standards that you'd rather not use


>"The problem with adding this is that in 10-20 years it's not so useful anymore. Then you'd have:"

Perfect reason for not writing any software. None would work for 20 years without maintenance (there are some exceptions of course).


No, it’s the perfect reason for not pulling software into the standard library, which has strong backwards compatibility guarantees and cannot be as cavalier about changing interfaces and deprecations as external crates.


A Collections grouping of common data structures and macros would be useful.

There are so many different idiomatic ways to instantiate a type in rust that a crate with a common calling convention for most data structures would be very useful.



Sibling nailed it but I can’t reply on that post. There are variations such as the vec! and map! macros, collect_vec, collect_map, etc.

It would be great to have a unified interface that exposes a common set of instantiation mechanisms and APIs. Other lib authors should be able to implement this API and get the same calling conventions.
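For what it's worth, the kind of map! macro being discussed is only a few lines of macro_rules! (a hypothetical sketch, not anything in std):

```rust
use std::collections::HashMap;

// A HashMap counterpart to vec!, in the spirit of the request above.
macro_rules! map {
    ($($k:expr => $v:expr),* $(,)?) => {{
        let mut m = HashMap::new();
        $( m.insert($k, $v); )*
        m
    }};
}

fn main() {
    let m = map! { "a" => 1, "b" => 2 };
    assert_eq!(m["a"], 1);
    assert_eq!(m.len(), 2);
}
```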


I think they want something like a ‘map!()’ macro.


Personally I would like a standard http library that has async and non async methods. One of the things that initially turned me away from Rust was the complexity of finding a library to use for simple http requests. I know this is also a strength, but it makes the intro days incredibly challenging


I wonder if that's a discovery issue. Having used rust for years the quick answer would be 'reqwest', but I guess there's not really any way someone would know that just jumping into the language.


Crates.io is great, but it certainly has a discoverability issue. Even knowing about reqwest it's far from trivial to conjure up a search that yields reqwest on the first result page. And I'm sorely missing features like "crates that use this also use" and "instead of this crate, others use".

If I had infinite time on my hands it would be a fun exercise to try to improve on it.


I disagree that it is hard to find this.

Searching "Http request in rust" for me yields the Rust Cookbook as the first result, which is using reqwest! (https://rust-lang-nursery.github.io/rust-cookbook/web/client...)

Regardless I agree that crates.io could have some more features but I am not sure how useful I would find a recommendation engine.


Right, but isn't it a bit of a shame that I search "http" in crates.io, the site that theoretically should be serving the "find a crate" use case, and I don't get suggestions that are useful?


There is lib.rs, which I find much more efficient for discovering crates.

Reqwest is currently #6 in its Web programming > HTTP client category[0]. The download counts, number of contributors, version history, and dependent crates are also prominently displayed.

[0]: https://lib.rs/web-programming/http-client


good json parsing is important these days. crypto is a must too. golang's crypto is really good (ssh, tls, sha, md5..) Right now in rust we rely on rust bindings to c libs.


I disagree about golang's crypto being good. It includes insecure functions like md5 and sha1 that have no place in a crypto library. They should be in some other part of the library (maybe add a "legacy" section) or just not included at all. The Go stdlib crypto also makes the (common) mistake of exposing unsafe primitives (like raw AES without the mode of operation needed to use it for encryption) at the top level of its various sections instead of in a well-documented "unsafe" or "hazardous" section. And it's missing useful, well-vetted, high-performance options like AES-OCB (which recently went out of patent).


Well sha1 and md5 are used all over the place so supporting them makes a lot of sense.

Most of your points are related to documentation, and a bit patronizing.

For all I care most of the functionality is there, fast and reasonably well designed.


> crypto is a must too. golang's crypto is really good (ssh, tls, sha, md5..)

Indeed. I would almost put my neck above the parapet and say that in the 21st century any programming language worthy of its name should come with decent crypto out of the box.

Crypto is no longer an option these days.


Rust actually tried this. Search for "stdx". For some reason, it didn't take off. Maybe we should try again.


I found stdx to be a useful set of sign-posts. But it's not been updated in three years. E.g. it suggests `error-chain` instead of `anyhow` and `thiserror`.


The basics of rust really are not that bad. It gets funky once you start diving deeply into types. If you are just hitting an api and getting json out followed by working with storage etc in AWS, the situation is pretty straightforward.


I’d rather see more of the stdlib usable on wasm32-unknown-unknown, or at least wasm32-wasi, than a generically bigger stdlib.


The absence of C++ interop as a focus is curious to me. I understand there may be more important problems within the Rust community and it's not yet ready to take center stage, but there's soooo much to be gained from being able to piecemeal-rewrite C++ code into Rust in a low-friction manner, or from making Rust as easy an option to reach for as C++ when writing new code for an existing C++ codebase (e.g. a goal could be that all new code in Chrome/Firefox should be written in Rust). The cxx crate is neat and a good start, but it's still too immature, and ultimately you are limited to anything that can be hidden behind thin interfaces.


Doesn't C++ not have a stable ABI? Doesn't that kind of make it difficult to interop with it? The ABI is an implementation defined thing by the compiler, right?


You don’t need a stable ABI per se if you’re working with source-level interop. Also, while it doesn’t have one in theory, it does in practice on each platform if I recall correctly.


If you consider the compiler to be part of the platform, that’s mostly true.

I think the hard problem in practice is that headers and templates make for a very fuzzy boundary between library code and client code. Calling C++ from Rust already works fine for traditional OO code, but it’s hard to imagine that anything could ever use modern templated C++ code that isn’t a full superset of C++.


If I'm not mistaken you can mix'n'match shared libraries built by either compiler on Linux. I think that's also true for individual object files within the executable. Same is true for Windows. I can't recall the state on Darwin & I don't know the state of other platforms.

I don't think templated C++ code needs to be a full superset of C++. For example, cxx provides bridged implementations for CxxVector. If cxx provided an easy-to-use mechanism to define your own bridging type without a lot of boilerplate, that could open up far more interop than is possible today.


Isn’t this as much a problem with C++ as it is with Rust? Both tend to need to drop to C for FFI. So you’d need a C++ stable ABI as well as one for Rust.


Look at the cxx crate. I don't care that there's a C FFI in the middle, just that it looks like magic. In other words, most or all C++ types, including templates, without any real complexity on the part of the user; conversely, Rust types accessible from C++ without complexity. I think the latter part is mostly true, although some patterns need some TLC. The former is still extremely immature.


I have Steve's book in paperback. Haven't had time to read it all, and given the rate things are moving now, I am afraid that some of it could be outdated.

As a developer, what I really want to hear is that Rust 2022 will be basically Rust 2021 with some bug fixes and performance improvements.

Let's hope we get there next year.


Isn't that the goal of the Rust editions¹ that are released roughly once every three years? Rust had a 2015, 2018, and now a 2021 edition, so if that pattern is followed you can probably expect a new edition in 2024.

1: https://doc.rust-lang.org/book/appendix-05-editions.html


The goal is not every three years. It's whenever people feel like it, but there's been discussion about formalising it to every three years


Hence the 'roughly' and 'probably'.


Stable Rust 2022 will include all of Stable Rust 2021, just with some more bug fixes, performance improvements and features. If anything from Rust 2021 becomes outdated then that happens because new features or syntax allow a much better way to do the same thing, not because the old thing no longer works.

That happens to most actively used languages. C# has seen some major evolutions in recent years and is 22 years old now, Java is 26 years old and has some major changes compared to what it was 7 years ago, and you can no longer discuss C++ (37 years old) without mentioning C++11 along the way.


Java today looks nothing like Java 5; however, it's largely backwards compatible. I hope that Rust isn't afraid of making larger changes to stay relevant. Java 8 is what brought a lot of people back after looking at more functional JVM languages such as Scala.


The jump from Java 8 (or earlier) to Java 11 (or later) is quite big though, and non-trivial software is not likely to work without at least updating dependencies and fixing some issues.


To me that's still better than no changes at all. I think the change to 8 was the biggest, after that they are manageable.


Is there any indication that Rust is not the new C++ in terms of language and ecosystem complexity? I just can't bring myself to learn languages like C++ and C# because they are changing at such rapid and complex paces, versus other languages like F#, Elixir, Clojure that are essentially feature complete. I'd like to learn Rust, but it just seems like a huge time investment.


I think Rust has already failed in its mission to become a replacement for the existing entrenched systems programming languages. The reason is simple - it does not offer safety without an inordinate amount of up-front complexity. In order to replace C/C++, a language must first and foremost be simple to learn and get started with. Even though Go is 90% a rehash of other languages, its success comes down to embracing the foibles of programmers and our need for tools that can be ramped up simply and quickly.

Rust is perhaps the language that people SHOULD be using, but likely won't. Until someone can offer something as simple and easy to learn as C that also automatically does the things Rust provides, without asking the programmer to expend double the mental effort, I don't think we will replace C in a major way.


As a full-time C++ developer who just started a Rust side project (with rocket.rs) to learn the language, I fully disagree. Coding is actually fun again, and it is becoming increasingly harder to code C++ at my job. I constantly get reminded that GC-free languages can do better. Damn Rust, you ruined C++ for me :(

Edit: Ok, maybe a bit dramatic, but I really hoped we would get working modules with C++17, and here we are, still years away...


We have working modules with C++20.

Granted the only way to fully enjoy them is being on Microsoft ecosystem, but hey plenty of us are.


I hope that in the future, new code written in C will be seen as similar to using unencrypted communication on the Internet.

It is not about how easy is it to write rust. It is about how, as a C programmer you are going to explain that your code is safe.

I know that as a long time C programmer, I still make occasional mistakes that lead to memory corruption.

In production code, there are still way to many pointer bugs. Use after free, etc. That needs to end.


Oh please. Preaching about safety in a globally inter-connected world devoid of privacy.

Newsflash: it's not a language's job to make software safe. Its job is to make the programmer's job as simple as possible, i.e. writing new software. If your language fails to do that, people will not use it, plain and simple.

How else could you explain the popularity of languages such as Python and Go?


You'll notice that both of those languages also give you safety by default. It's pretty much only C and C++ that still make programmers write artisanal memory leaks.

I didn't really understand Rust until I saw a CppCon talk about how to use std::unique_ptr correctly, and I realized that the Rust borrow checker is just that talk implemented in a compiler.


If your project can be written in Python or Go, then yes, do that. I don't think Rust is trying to compete with Python or Go.

However, if the project needs a low-level language, then skip C and C++ and use Rust.

Almost from the start, high-level languages have been about safety. In assembly, memory is untyped. That leads to way too many bugs. So languages evolved to get more elaborate type systems. Compare, for example, K&R C, which lacked function prototypes, with current C, which does have them. That saved huge numbers of bugs.

Traditionally, garbage collectors had to be used to maintain memory safety. However, that conflicts with many of the uses of C. So Rust had to come up with something that is both safe and compatible with C.


> Preaching about safety in a globally inter-connected world devoid of privacy.

What kind of logic is that? "Because Google sells your data, I don't have to worry about memory safety bugs in my code." Come on.


How can you explain the popularity of big macs and coca-cola if the purpose of food is to help people build healthy bodies?

Maybe your criteria isn't universal?


We are talking about software here, don't change the subject please. My language preference won't make me unhealthy.


HTTPS took off when Let’s Encrypt (and to some degree Caddy) made it easy to be secure. Rust is more like, I dunno, openssl: easier than doing everything yourself by hand, but not even remotely easy.


Using C instead of Rust is similar to rolling your own crypto because openssl is too hard.

There are easy languages like Python or Go. But they are not suitable for many low-level, high performance tasks.

Rust is competing in the domain of easy to use languages. It is about low-level system code.

Edit: Rust is not competing in the domain of easy to use languages


> Rust is not competing in the domain of easy to use languages

In a sense Rust very much is competing in this space. Consider the extensions to JS (wasm), Ruby, Python, and others that make it easy to call out to Rust. This is being used for performance-critical sections of code (or for access to high-quality, stable, network- and security-focused libraries). It competes directly with the native language but offers a path that is easier than C or other options.

Rust appears to be finding a lot of success in that space.


> Using C instead of Rust is similar to rolling your own crypto because openssl is too hard.

Probably, yeah. At least in a world where openssl is actually so hard that doing stuff yourself seems like a reasonable idea, because theoretically secure doesn't do anything for you if the thing is too hard to actually use - there's a reason PHP utterly dominated webdev even when it sucked for security.

> Rust is competing in the domain of easy to use languages.

Then it loses. If you meant to write that Rust is not competing for easy to use, then it loses separately, because technically superior but too high of a barrier to entry still loses (there is a reason we aren't all exclusively running software written in Ada on OpenBSD).


In what way does Ada have a high bar to entry? You just need GCC to get started with it.


It's not about compiling code, it's about being able to write it. IMO, C is easier to write than Ada. (Write badly, perhaps, but this is about the lowest barrier to entry)


>"I hope that in the future, new code written in C will be seen as similar to using unencrypted communication on the Internet."

And I hope that people stop preaching and would live and let live.


IMO rust is much easier to learn than both C and C++ for anyone who is not already familiar with those languages. I suspect it will pick up adoption merely because people don’t want to deal with C or C++.


The learning curve of Rust is extremely front-loaded compared to C or C++; that is the issue, even though C++ might be more complex overall than Rust. C is simple, and I don't think anyone who has read K&R would honestly say it's harder to learn than Rust.


My C class was 12 years ago but I still remember vividly struggling with it on a relatively basic program: figuring out how to make the teacher's library work, being greeted with "Segmentation fault (core dumped)" with zero additional clue, hence needing to install GDB and learn it…

Rust is much much more gentle to learn than C: when you don't understand what a pointer is and how it works, you get an explicit error message given by the borrow checker and you can just ask on the official discord channel for beginners and you've solved your problem.


"and you can just ask on the official discord channel for beginners and you've solved your problem."

If only we had Discord 12 years ago... The reason you struggled with C is simple: lack of theoretical knowledge. Your teacher should have started with the core workings of a CPU before he showed you C.

Maybe he did, but you simply didn't bother to learn?


One might argue that one of the problems with C is the lack of good resources teaching that theoretical knowledge. For example, the Rust book has a great section teaching the difference between the stack and the heap. Most C resources I have tried to learn from in the past have just assumed that I would learn that sort of thing by osmosis.


Theoretical knowledge is acquired at a university or by reading a book on computer science. Then you apply that knowledge when you use C or Rust. It is good that the Rust book teaches some CS concepts, but that is not the job of a book about Rust and can be considered a bonus.

To really learn about the inner workings of the CPU, you must read a book about... the inner workings of a CPU.


> To really learn about the inner workings of the CPU, you must read a book about... the inner workings of a CPU.

But C doesn't map to the "inner workings of the CPU", it maps to a certain abstract machine, defined in the C standard [1]. If you think that C maps to an actual CPU, you're already using C wrong, and it probably shows – in the form of undefined behavior bugs.

[1] standards, actually, pick your preferred version


> To really learn about the inner workings of the CPU, you must read a book about... the inner workings of a CPU.

That's simply not true. There are all sorts of learning resources that aren't books. I have personally learnt a ton about the inner workings of CPUs by following the /r/rust subreddit for example.

Most programming languages are accessible to people without a university degree or the desire to read a doorstop book on the subject. C and C++ are a notable exception to this. I don't know how you can argue that a language is easy to learn whilst simultaneously holding that it requires a university degree.


“C isn't hard, you simply didn't bother to learn” is probably a great summary of why C and all of its ecosystem is really hard for beginners. I'm really glad that the Rust community as a whole accepts the fact that Rust is harder than most languages (by that I mean Python, PHP, Java, etc., but not C) and puts a ton of effort into learning material and ergonomic tooling.

Holding this kind of argument in a thread about how Rust is too hard is a bit strange though…


I think that’s only true for very simple programs. As soon as you need to use a dependency C and especially C++ front load a bunch of complexity around build systems, header files, etc. C doesn’t even come with a HashMap built in, so you end up hitting this very early on.


It's not as simple as Python though, most people who are writing C++ are doing so because they already know it (e.g. from school or past work) or they have no option (building on an existing C++ codebase). Being in the position of "I need to learn a first systems language, should I pick C++ or Rust" is not very common, IMO.


I don't think anyone is claiming that Rust is as simple as Python. Nor is Rust really targeting Python users. Java isn't as simple as Python either, but nobody thinks that makes Java unsuccessful.

> Being in the position of "I need to learn a first systems language, should I pick C++ or Rust" is not very common, IMO.

What makes you think that? Everyone who learns systems programming is in this position at some point. Perhaps not if they learn at university and the language is chosen for them, but that just pushes the choice onto the university itself. And universities are increasingly offering Rust courses.

Of course, plenty of programmers aren't systems programmers and never will be. But that doesn't negate the need for systems programming languages, as demonstrated by the continued use of C and C++. If you do need a systems language then Rust can make a good case for being the simplest. And could well become the de facto standard over the next few years.


That’s all reasonable, I just feel like there’s a bit of a critical mass needed to really enter the mainstream as a “de facto standard” and I don’t know if the activation energy on Rust is too high to reach that critical mass

Rust to me is starting to feel like how Haskell was a decade ago, where it seemed like everybody was talking about it but serious adoption was always just on the horizon.

But maybe my expectations are just too high. It seemed quite exciting at first to see everything being rewritten in Rust but I guess that’s not how the software world really works. Old projects are entrenched because they take many years of dev time to mature


> Rust to me is starting to feel like how Haskell was a decade ago, where it seemed like everybody was talking about it but serious adoption was always just on the horizon.

I think you might just not be aware of how widely adopted Rust already is. Google, Microsoft, Amazon, Dropbox, Cloudflare to name just a few companies already have important projects in Rust. Curl is adding a Rust backend. The Linux Kernel hasn't committed yet, but groundwork has begun and it's looking like it's going to happen. Serious adoption has already happened.

You're right that we won't see mass rewrites (although there have been some). That's just too big a project. But new projects are increasingly choosing Rust. It'll be a decade before Rust is the de facto standard if it happens. But personally I consider that quite likely.


>"But that doesn't negate the need for systems programming languages, as demonstrated by the continued use of C and C++"

C++ uses far exceed systems programming.


The point of rust isn't to be simple. It's to prevent all the C++ people from whining that all the higher level languages have GC. In truth, 95% of them should just be using a language with GC (i.e. C#/Go/Python), and the other 5% that claim they understand how unique_ptr works should have no problem switching to rust, because they are already running an implementation of the borrow checker by hand to verify that their code doesn't have bugs.


For most of my projects, I would switch from Rust to a GC language in a heartbeat, provided that it

1) is as safe and as fast as Rust, or at most 1.5x slower

2) offers "fearless concurrency" and easy, safe, and efficient parallelism

3) has sum types and a good static type system

4) has great tooling and package management (see cargo)

5) has a vibrant and growing ecosystem

So far, that language doesn't seem to exist. Most fail at points 1+2.


The new concurrent OCaml is improving on all of them.


For me, Haskell is already there (maybe you find its ecosystem not vibrant enough?) and OCaml will be there with multicore OCaml.


I accept that Haskell is that language for some. As for me, I tried Haskell twice, and gave up each time:

1) reasoning about performance is just too difficult (due to lazy evaluation)

2) the community seems too focused on providing ever more abstract abstractions instead of practical solutions to actual problems

3) purity and freedom from side-effects cause a lot of friction and make many simple problems really hard; the libraries and patterns to solve that problem are too complicated for me to comprehend, so maybe I'm too dumb for Haskell (though I don't have any problems with Rust, so...)


Swift and Kotlin are partial matches. Sadly, that’s about it.


>"It's to prevent all the C++ people from whining "

That's some very thoughtful and noble goal. Maybe people who use C++ (and bunch of other languages) are not the ones whining.


Yes python was easy but I've never really bothered with C++. It looked so hard and scary. But Rust gave me more guarantees. It was easier to start learning and do something with it.


Yet nobody has ever written & published a production grade persistent key-value store in rust. It's puzzling.


I thought tikv was supposed to be this, but perhaps I misunderstand.


TiKV uses RocksDB, which is not written in Rust.


You casually mention C++ along with C but you can't really put them in the same bucket. C++ is terribly, mindbogglingly, utterly complex, its main selling points being zero-cost abstractions and compatibility with C. Perhaps we have different experiences but I don't see Rust as complex to learn with respect to C++. In C++, coming from C, the syntax is deceptively similar but you have to track in your head where the memory is, and seemingly simple operators like "=" or "()" can mean a whole lot of different stuff. Not to mention the craziness of the more advanced features.

Rust is first a systems programming language with zero-cost abstractions and the least possible pain; then it is a safe language, way safer than C++.

In any case I am following the community and I have a hard time finding people saying C++ is nice for them. There are people who have been coding C++ for 20+ years and "love" it, but they are few and far between. Almost nobody who tried both Rust and C++ continued using C++ for new projects.

So I predict Rust won't replace C, they have different use cases, but it will make a dent in C++.


There is one caveat though, as long as Rust depends on LLVM, it won't fully make it.


> as long as Rust depends on LLVM, it won't fully make it

Why won't it? I haven't really used rust or LLVM before, so I am genuinely curious.


LLVM is written in C++, so any change to the Rust compiler that requires changes in the backend requires C++ knowledge, and naturally a dependency on a C++ compiler to get the Rust compiler binary.


Ah, I see what you mean now. The follow up question I have is: so what? Python is an extremely popular language that is built on C/C++. Unless you are a contributor to the rust spec, why would it matter what language the compiler is written in?

The way I take it, the rust compiler team can either spend their time trying to get to being self hosted, or they can spend their time making improvements to the language.


Because one of the Rust goals is to improve the security landscape and reduce the use of C and C++.

It is a bit hard to achieve such a goal if improving Rust requires, to some extent, continuing to write C++.

Naturally some day the compiler might be fully bootstrapped; however, that implies throwing away the work being done by legions of C++ devs on LLVM (the same applies to the front ends being written to plug Rust into GCC).


Yeah, it would be nice if the compiler team was dog-fooding, but we need to be pragmatic don't we?


Indeed, hence why many need to understand that Rust vs C++ isn't a zero sum game, and any improvements to C++'s own security story are more than welcomed.


Excellent point!


Rust isn't nearly as hard as it seems. Most people who think it is hard went up against it once or twice for a week or so and gave up. What they didn't realize is unlike some other hard languages (ie. C++) once you are over the initial learning curve, it is pretty smooth sailing, and it codes quite fast - almost like a higher level language. It has a very nice payback for the effort I think.

I recommend the 2nd edition O'Reilly book + writing lots of code for about a month or possibly two. After that, I think many will agree it isn't nearly as hard to write as they initially thought it would be.

The biggest bonus: Writing Rust is very fun. I can't pinpoint why this is, but I totally understand why it keeps getting voted favorite language. A total joy to write.


There is a way in which Rust reduces mental effort: a compiled program is more likely to work in the way you expect. It cuts out a swathe of post-compile testing and prodding, or at least, it does in my experience. I can't speak for the other people who have voted it "most loved language" six years running, but that's one of the reason why I love it.

It was the phenomenal Rustlings course that got me over the learning hump.


I continue to be surprised by this despite logging some good number of Rust hours. Cargo run and it just works as expected even with complicated data sharing code etc.

Of course most of the work gets done by the compiler safeguarding and guiding during development. People make noise about lifetimes etc., but those concerns mostly disappear once you are ready to pay the penalty of Arc (and in most cases, if you are wrestling with lifetimes you are already on the verge of needing Arc). The same goes for cloning as needed. At the end of the day, these incur a more predictable and acceptable performance penalty than many other languages do, while providing good safety.
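To sketch the trade-off I mean (a minimal example of my own, not anyone's production code): instead of fighting lifetimes to share data across threads, wrap it in Arc and pay a small, predictable refcount cost.

```rust
use std::sync::Arc;
use std::thread;

fn main() {
    // Shared, immutable data: Arc gives safe sharing with no data races.
    let data = Arc::new(vec![1, 2, 3]);
    let handles: Vec<_> = (0..4)
        .map(|_| {
            let data = Arc::clone(&data); // cheap pointer clone, bumps a refcount
            thread::spawn(move || data.iter().sum::<i32>())
        })
        .collect();
    for h in handles {
        assert_eq!(h.join().unwrap(), 6); // every thread sees the same data
    }
}
```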


> something as simple and easy to learn as C

C is neither simple nor easy to learn. Incorrect and buggy C is simple and easy to learn.

If you think C is "simple" and "easy to learn", your C code is almost certainly full of undefined behavior bugs and security holes.


Testing for off-by-ones considered hard. Gotcha.


Indeed, otherwise the CVE database wouldn't be full of exploits due to off-by-one memory corruption issues.

By the way, it is sooo easy to test for them,

https://googleprojectzero.blogspot.com/2021/12/this-shouldnt...


You're trying to make a joke, but the flippant dismissal of all the UB is pretty much exactly what they're talking about.

Writing C with a "how hard can it be?" attitude is a great way to get a CVE named after you.


Writing software, especially systems, low-level software, is hard. Writing C, the language, is not hard by itself. You shouldn't confuse the problem domain with the language's perceived shortcomings.

Rust won't make systems programming any easier; it will still require the use of the unsafe part of the language. Only now you will also have to deal with a much more complex language than C. That is why Rust has failed to replace C and C++.


In C you find

- types of varying width (how big is an int?). Rust solves that

- null terminated strings. Rust's are saner

- what happens in an overflow? What are implicit casts? Rust clarifies those edge cases

- standard libraries often use NULL to signal error, no actually -1, no actually non-NULL. Rust has Result

And I haven't even scratched the surface...

All those cases are pretty basic and happen everywhere. For me it's clear that C is much more complex than Rust if you think about the edge cases. All those safeguards are vital.

But yes, Rust is a much bigger language than C so it won't replace it for really low-level, resource-scarce tasks. For that, look at Zig.
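A couple of the points above are easy to demonstrate in a few lines (my own sketch; the `parse` call stands in for a C function that would signal failure with NULL or -1):

```rust
fn main() {
    // Fixed-width types: no guessing how big an int is.
    assert_eq!(std::mem::size_of::<i32>(), 4);
    assert_eq!(std::mem::size_of::<u64>(), 8);

    // Overflow is a defined edge case: checked arithmetic makes it explicit
    // (and debug builds panic on plain overflow instead of wrapping silently).
    assert_eq!(u8::MAX.checked_add(1), None);

    // Errors as values: Result instead of NULL / -1 sentinel conventions.
    let ok: Result<i32, _> = "42".parse();
    let err: Result<i32, _> = "not a number".parse();
    assert_eq!(ok.unwrap(), 42);
    assert!(err.is_err());
}
```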


Rust was created to replace C++ in Firefox. Today, lots of Rust code in Firefox, replacing C++, and no Go code in Firefox. Therefore, for this goal, Go failed and Rust already succeeded.


Not a good example, since the Firefox experiment is going to be way more tolerant of Rust's shortcomings than the rest of the industry.


I think it's too early to say it's failed to replace the existing systems programming languages and its trajectory looks positive. For example, in the 2019 Stack Overflow developer survey[1] only 3% of professional developers reported using Rust. A year later[2] it was 4.8% and a year after that[3] it was 6.4%. So in two years professional developer use doubled, by that metric. For reference, in the SO surveys C++ and C use was about 20% and 16%, respectively and, if anything, slowly declining (as a percentage of respondents, of course).

1. https://insights.stackoverflow.com/survey/2019#technology-_-...

2. https://insights.stackoverflow.com/survey/2020#technology-pr...

3. https://insights.stackoverflow.com/survey/2021#technology-mo...


If I were to write a small command-line utility that needs to be fast, safe and lean I would definitely write it in Rust. It's got the tooling C++ doesn't (cargo), the ability to statically compile a minimal binary that is small and fast (because we don't pay for abstraction like C++), and it allows me to write reasonably clean code using the functional paradigm. It is reasonably easy to interface with C code (did that recently to access libelf and fuse). Unit testing and documentation are embedded in the language.

So, I agree with you, it will probably not become a replacement, but it succeeded in providing a credible alternative that has been used by many already.


There needs to be a GUI framework with some official support.


It’s hard to imagine what an officially supported GUI framework would be. And I’m focusing on the official.

There are toolkits out there and game engines picking up steam.

My guess is that for a GUI in Rust to truly shine it will have the hard task of working both in native OSes and WASM on the Web. People are working on these frameworks, but most are pretty limited at the moment depending on needs. (If folks believe it’s better than that, please share, it’s only a space I dabble in sometimes)


That's not a job for stdlib. Language needs to be flexible, with a small core. Rust is already hard to learn, even the language itself is packed with features.

Add the stdlib and you have plenty of confused faces who after some time fail to see the point.


I didn’t say it should be in the stdlib.


"With some official support" generally means that the library in question is in the stdlib.


The regex crate is owned by the Rust organization, but is not part of the stdlib. That said, I don't think there are any GUI crates that are as "standard" as regex is where this would make sense.


No it doesn’t. Tokio has some official support without being in the stdlib.


How many languages have a GUI framework in the stdlib?


Python has bindings to Tcl/Tk in its stdlib. But citing Python is cheating, imo.


Java has multiple GUI frameworks in its standard library.


Python comes with Tk


GUI frameworks are defined by OSes, not languages.


Does anyone have any idea as to how many C++ shops have adopted Rust or how many C++ projects have been converted to Rust? That's what I want to know in 2022.


I think at this point, Rust will not really replace C++ just like how C++ never replaced C.

Given the learning curve of Rust and the lack of tools it's still a bit of a gamble to choose Rust over C or C++ IMHO. Why take that risk when the benefits aren't entirely clear cut.


Inventing programming languages is super fun, but aren't we almost post-programming-language?

In 30 years, do we really expect to be programming in some machine-human intermediate language? I hope that we'll be having an open ended conversation with the machines to iterate our intent, with working software at most stages along the way.


> In 30 years, do we really expect to be programming in some machine-human intermediate language? I hope that we'll be having an open ended conversation with the machines to iterate our intent, with working software at most stages along the way.

This is actually entirely feasible, although perhaps not open-ended conversation. Instead it could be that you feed the machine training data which describes how to solve a particular task, and the output is a program that solves that task. So we would all end up being ML engineers.

Scary I know...


Perhaps more modestly - how far are we from just being able to translate from one high level language to another, assuming the ability to refine the translation as opposed to a one-shot job that has to correctly interpret the intent?

I.e. will it matter what language anyone uses? Here, take this software renderer I wrote in Python, and translate it to C++. "Conversation" about choices and ambiguities ensues...


Maybe there is the possibility of a meta high-level language capable of describing/encompassing all the others without exceeding human cognitive capacity. Instead of compiling to a common low-level target like assembler, transpile all the existing sources to a very high-level one optimized for human understandability.


I think we soon will have NLP capable of that. Too much money involved to resist.


> In 30 years, do we really expect to be programming in some machine-human intermediate language?

Just as when people asked this 30 years ago, yes, though from our perspective today the tooling may seem near magical.


Great post by Nick offering a very interesting fresh perspective from someone who was an active member of the Rust core team in the past, then took a break for a couple of years and now comes back working on Rust full time (at Microsoft). Great overview of outstanding issues and priorities as he sees them.


I really prefer Zig. Rust code looks a bit unreadable...



