I don't see a future where C survives, not only because of memory corruption bugs (although that's a pretty big one), but also for usability: the lack of package manager, common build system, good documentation, good standard library, etc. are just too much to compete with any modern system language.
I've been seeing those exact words for decades now, and C is still going strong. Every few years a new language comes along, someone writes something in it that was previously written in C, someone might even write a basic OS in it, and after a few years that language is almost forgotten. A new one arrives, and again someone writes something in it, but in the end we still use C for the things we used it for 10, 20, or for some of us even 30 years ago.
Usage of C in new projects has fallen dramatically in recent decades. It used to be the case that C was considered a general-purpose programming language, and applications such as Evolution were written in it. Today big applications in C are increasingly rare, and Rust is only accelerating this trend - nobody wants to have buffer overflows anymore.
> lack of package manager, common build system, good documentation.
This is where C is superior to virtually every other language. It has K&R to start with [1], a wealth of examples to progress from there, man pages, autotools, cmake, static and shared libraries.
> good standard library.
It should have hash tables at least, but it isn't bad.
[1] Which is still the best language book ever written (yes, it has some anti patterns, you unlearn them quickly).
Huh? In what way are C’s books, documentation, or build systems superior to those found in other languages? Most languages have plenty of good books written about them, and plenty of code examples online. I can’t speak for every other language, but I find MDN (JavaScript) and the Rust docs consistently better than C’s man pages. Ruby’s documentation is great too.
As for build systems, autotools is a hilarious clown car of a disaster. You write input files (configure.ac for autoconf, Makefile.am for automake) to generate a huge, slow shell script (configure), which generates a makefile, which finally invokes gcc? It is shockingly convoluted. It seems more like a code-generation art project than something people should use. CMake papers over it about as well as it can, but I think CMake is (maybe by necessity) more complex than some entire programming languages. In comparison, in Rust “cargo build” will build my project correctly on any platform, any time, with usually no effort on my part beyond writing the names and versions of my dependencies in a file.
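To make that last point concrete, here is roughly what that "file" looks like. A minimal Cargo manifest (the package name and dependency here are just illustrative, not from any real project) is the entire build and dependency configuration a small Rust project needs:

```toml
# Cargo.toml - the whole build/dependency configuration
[package]
name = "example-app"   # hypothetical project name
version = "0.1.0"
edition = "2021"

[dependencies]
serde = "1.0"          # illustrative dependency with a version requirement
```

Running `cargo build` against this reads the manifest, fetches and compiles the declared dependencies, and builds the project, with no configure step or generated makefiles in between.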
And as for package management, C is stuck in the 80s. It limps by, but it doesn’t have a package manager as we know them today. There’s no cargo, gems, npm, etc equivalent. Apt is no solution if you want your software to work on multiple distros (which all have their own ideas about versioning). Let alone writing software that builds on windows, Mac and Linux.
So no, C is not superior to other modern languages in its docs, build system or package manager. It is vastly inferior. I still love it. But we’ve gotten much, much better at making tooling in the last few decades. And sadly that innovation hasn’t been ported back to C.
Sure, it has not been standardized, it is not part of the standard library, so what? Did the world stop? I mean, practically speaking, who cares? Implement it, or find libraries that did. There are plenty. I posted this one because it exists for an OS; OpenBSD, since 1999. Plus, AFAIK ohash is portable enough. It consists of 2 files, and you can compile it with -std=c89. Only the bounded attribute is ignored.
If you want I could have brought up hcreate, hdestroy, and hsearch:
> The functions hcreate(), hsearch(), and hdestroy() are from SVr4, and are described in POSIX.1-2001 and POSIX.1-2008.
I use stb myself, so I have no qualms with that. The point is rather that GP was discussing praise for C’s standard library, and even the most portable single-file include-only dependency remains just that: an external dependency that isn’t part of the C standard library (and no, posix isn’t C).
Does it make much of a difference though? Take the hyped Rust for example. Most useful stuff is in crates, i.e. an external dependency. No one seems to have a problem with that.
Personally I do not mind using libraries typically installed by the Linux distribution's package manager anyways.
If the question is whether or not I think the C standard library could be improved, then yes, I would say it could, but I do not want it to have a hash table and all sorts of stuff like that, because there are lots and lots of ways to implement them, and they might not suit my needs. C is great, because you can build it from the ground up (if you want to) to make it specifically for your use case. It gives you the building blocks. I believe I have a comment regarding this somewhere, that I like C because it does not implement stuff for you that is in some ways "generalized", which is often a bad thing. This is my problem with "it should have hash tables at least". You cannot implement it in such a way that it suits everyone's needs.
Rust not having a good standard library is a huge problem. This increases the risk of a rust codebase due to the high number of third party dependencies.
I only said that "I do not mind using libraries typically installed by the Linux distribution's package manager", which was in respect to C.
As far as Rust goes, yes, I do not like that crates are full of one-liners, and so forth. It shares the same problems that npm has. I have run cargo build on many Rust projects before, and the dependency trees it pulls in are a deal-breaker for me.
What do you call Linux distros' package managers then? I mean, in distributions like Debian you can even download a package's source code with apt-get.
>What do you call linux distro's package managers then?
If you want to count them as package managers, they're by far the worst of the well-known ones (with some notable exceptions, e.g. Guix's and NixOS's).
They're not portable between distributions or even different versions of the same distribution (!), since it's non-trivial to install older versions of libraries (or, hell, different versions of the same library at the same time). Not to mention that it's a very manual and tedious process compared to all the language-specific package managers. 'Dependency hell' is a problem virtually limited to distro package managers (and languages like C and C++ that depend on them).
Getting older, unmaintained C programs to run on Linux is an incredibly frustrating experience and I think a perfect demonstration of how the current distro package manager approach is wholly insufficient.
> If you want to count them as package managers, they're by far the worst ones of all the well known languages (with some notable exceptions e.g. guix's and nixos's).
They have the only feature I care about: cross-language dependency management.
Unless you are suggesting we reimplement everything in each language and then make users install ten different XML parsers, SSL implementations, etc., just because of not-implemented-in-my-favorite-language syndrome.
Those are the features that make C flexible on mainstream platforms and also usable on so many other platforms where other languages just don't/won't work.
I don't see C being in much worse shape than C++ with respect to build system and package manager. It's slow going, but progress seems to be happening there.
Are you saying both are doomed? Or is there some scenario where C++ survives without C?
I think both are, long term (think FORTRAN where it’s not particularly popular but a lot of existing code is maintained and not rewritten).
C++ is actually in a slightly better spot ironically because it’s harder to integrate with. If you have a C program you can pretty easily start replacing parts with Rust. You can’t do the same with C++ which insulates it better in that sense.
Reports of Fortran's death (latest standard 2018) are greatly exaggerated (much like C). It's receded to a niche, but it's still a very important niche (numerical, HPC). Hopefully, the development of a new Fortran front end for LLVM (from PGI/Nvidia?) pans out, as this would fill a gap in LLVM's offerings, and provide more competition for ifort and gfortran.
I definitely like the "lack of package manager, common build system". For me, having those is a negative for a language like rust.
You see, my OS already comes with those, and I expect to use them. I have the Debian package system: dpkg, apt, aptitude, and so on. It's a big mess when other software tries to steal that role. I have the traditional build systems and more: make, cmake, autoconf, scons, and so on. If I'm building a large project with multiple languages, I'm going to use one of those tools. If a language wants to fight me on that, I'm not interested in that language.
Thanks to LLVM and GCC you can happily write embedded code in a higher level language, but the vendors don't bother supporting it because a lot of embedded coding isn't really what we would call software (no tests etc.)
Toolchains are one side, but garbage collection and big standard libraries are also a big reason. Anything with under a MB of RAM has few modern languages to choose from: it is still basically just C, C++, Rust, Lua, or MicroPython.
People have been looking at that for years now. I’m convinced it’s going to happen one day. It might not be Rust, but it is going to happen that we will have different models for writing these kinds of things.
Perhaps a combination of a language like Zig (a 1:1 replacement for situations where you really do want a lot of manual low-level control) and higher-level languages like Rust eating into more and more of the use cases.