Debian GNU/Hurd now can compile 71% of the packages from the Debian archive (gnu.org)
82 points by pjmlp on July 5, 2024 | hide | past | favorite | 39 comments


The really "weird" or "funny" thing about microkernels IMHO is that in a way, everyone is running one nowadays… it's just distributed across the many processors in your system. SSDs have processors, GPUs are giant processors, NICs have processors, chipsets have processors, USB controllers have processors, sound chips have processors, even the processor has… smaller extra processors.

PCI Express is the IPC standard for most of these and IOMMUs have brought in proper isolation.

… in a way, it's a very inflexible, inhomogeneous and opaque microkernel getup.

I think Apple are currently the only people who actively understand this and are trying to use it to their benefit, with their ongoing deep integration of peripherals. I just wish they'd get rid of the inflexibility and opacity :(


There is a reason these small processors exist. The system would be significantly more complex and less efficient without them.


Genuine question: does anyone here run a GNU/Hurd based OS as their daily driver?

I've always thought Hurd seemed pretty cool but the thing that stops me from using nearly any OS less popular than Linux is always driver and software support.


I've tried. It's not usable as a daily driver OS for most of the HN audience. You have to make a lot of compromises and really believe in Hurd.


What sort of compromises? I would speculate that performance would be a big issue for a project that hasn't had the sheer number of eyeball-hours on the code like Linux has had, but I really have no idea!


From previous Hacker News comments, there is a perception that even though Hurd is a microkernel, it wasn’t a very good attempt at one.

https://news.ycombinator.com/item?id=11320648


Drivers, yes, of course. But regarding software, I think nowadays we can have a very decent and usable machine even if it can only run Firefox/Chromium (provided it runs them perfectly).


The world is full of many ironies, but perhaps one of the largest in computing is that Hurd, a project that began in 1990, was not only beaten to market by Linux, but was even upstaged by Nintendo (a glorified toy company) with their microkernel-based Switch OS.

(Edit: https://media.ccc.de/v/34c3-8941-console_security_-_switch and https://en.m.wikipedia.org/wiki/Nintendo_Switch_system_softw... )


I got the impression that the FSF rather downplayed their own kernel development after Linux became available. I remember reading things like, you can still work on Hurd if you want to, but it's more of a research project.


I recall that as well. Linux basically filled the need/goal they wanted with Hurd, so it became far less important.


[flagged]


Perhaps it should be called Systemd/GNU/Linux these days.

Beyond glibc and coreutils, does GNU really bring much to the party any more? Bash - don't use it. Clang seems to be inexorably taking over from gcc.


Let me know when the mainstream distros have switched their kernel builds to Clang.


I wouldn't say taking over (not anymore), but yeah, the fact that Clang is basically an almost 100% "drop-in" alternative to gcc for Linux makes GNU software a bit less relevant/central to Linux. For better and for worse, I guess!


Seeing this in reply to an indisputably correct usage of the word "Linux" is quite silly


Poe's Law prevents me from knowing how to interpret this comment.

(oh nm it's literally a copypasta so in that case almost certainly posted in a spirit of parody)


Oh if only it were that gracious. My impression was FSF spent some time scolding us that we should wait for Hurd (not to mention Guile). But the Hurd project got deeply bogged down and never got to usability. The closest to admitting reality was the time Stallman started saying we should talk about "GNU/Linux" and not "Linux".


TBH, Stallman's baby has always been the compiler suite and the development environment. As far as he's concerned, 99% of the moving parts are in the GNU part of the environment and people should acknowledge this by using the name.


IMO in hindsight we can say Linux was probably the technically superior choice.

I can see the appeal of microkernels in theory, but in practice? Here's what I gathered are the general problems:

* A microkernel won't stop some very bad things, like filesystem corruption. Some kinds of bugs are still a huge pain.

* Devices like hard disks and video cards have gotten smarter. They have lots of state, their own firmware, and can potentially DMA into system memory, bypassing OS protections. This means the effectiveness of splitting things into processes is likely limited when the video card can just take your entire system down anyway.

* The "system can tolerate a part of it crashing" is great, IF anyone bothers to take advantage of it, and IF things like hardware can be properly reinitialized. In practice your desktop probably doesn't account for that so your user session goes down. Might as well reboot.

* HURD specifically was limited to a 32-bit address space for filesystems for a long time, which meant it couldn't handle disks larger than 2GB (or some such size).

* HURD processes can't trust each other. You can't have part A tell part B to drop their cache reliably to free up system memory. Every bit is separate from another, and can just ignore such a request.

* HURD specifically contained design decisions made to appeal to Stallman's pet peeves about sysadmin power. This is irrelevant today, but likely hampered adoption.

* Performance is bad.

* A monolithic kernel can always imitate a microkernel where convenient. The reverse doesn't hold true.

HURD introduced a bunch of difficulties and limitations. Linux could mostly ignore the costs of a microkernel and get things done well enough much faster. Where microkernels had some upside like usermode filesystems, Linux could just hack it in and only pay the price if you wanted that.
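The "hack it in" approach described above is essentially what FUSE does on Linux: the kernel forwards VFS requests to an ordinary user process and relays the replies. Below is a toy sketch of that request/reply shape, with plain pipes standing in for the FUSE device and a made-up one-line text protocol (real FUSE speaks a binary protocol over /dev/fuse; the paths and message format here are invented for illustration):

```python
import os

# Toy user-space "filesystem server". The parent plays the kernel side,
# forwarding a read request over a pipe; the child answers from a dict
# standing in for real storage.
FILES = {"/hello": b"hello from user space\n"}

def serve(req_r, rep_w):
    """Child: answer requests of the form b'READ <path>\\n'."""
    with os.fdopen(req_r, "rb") as req, os.fdopen(rep_w, "wb") as rep:
        for line in req:
            op, _, path = line.strip().partition(b" ")
            data = FILES.get(path.decode(), b"") if op == b"READ" else b""
            rep.write(len(data).to_bytes(4, "big") + data)  # length-prefixed reply
            rep.flush()

def read_via_server(path):
    """Parent ("kernel" side): send one request, collect the reply."""
    req_r, req_w = os.pipe()
    rep_r, rep_w = os.pipe()
    pid = os.fork()
    if pid == 0:                          # child: the user-space server
        os.close(req_w); os.close(rep_r)
        serve(req_r, rep_w)
        os._exit(0)
    os.close(req_r); os.close(rep_w)
    with os.fdopen(req_w, "wb") as req, os.fdopen(rep_r, "rb") as rep:
        req.write(b"READ " + path.encode() + b"\n")
        req.flush()
        n = int.from_bytes(rep.read(4), "big")
        data = rep.read(n)
    os.waitpid(pid, 0)
    return data

print(read_via_server("/hello"))          # b'hello from user space\n'
```

The point of the sketch: the "filesystem" is just a process you can kill, restart, or debug like any other, which is the microkernel upside Linux bolted on selectively rather than paying for everywhere.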


> "This means that the effectiveness of splitting things into processes is likely limited when the video card can just take your entire system down anyway."

Windows moved to user-mode display drivers (https://en.wikipedia.org/wiki/User-Mode_Driver_Framework) at least in part because Microsoft was getting blamed for the many blue screens caused by the extremely crappy video drivers of that era. Seems to have worked out for them.


> Performance is bad

Is there any benchmark?


Here: https://www.phoronix.com/review/debian-hurd-2015/2

Some are not bad, but some are awful. It's going to depend. A microkernel means more context switches, so in general it's going to lose; by a lot if you hit the really pathological cases.
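The context-switch cost is easy to feel on any Unix. The sketch below is not how Mach/Hurd IPC actually works (real microkernels use optimized message primitives), but bouncing a byte between two processes over pipes forces the same kind of switches a microkernel pays for on every service request, and comparing that to a plain function call (the monolithic, in-kernel analogue) typically shows an orders-of-magnitude gap:

```python
import os, time

def pipe_round_trip_cost(n=10_000):
    """Bounce one byte between parent and child n times over two pipes,
    forcing (at least) two context switches per iteration -- a crude
    stand-in for microkernel message passing."""
    to_child_r, to_child_w = os.pipe()
    to_parent_r, to_parent_w = os.pipe()
    pid = os.fork()
    if pid == 0:                               # child: echo server
        os.close(to_child_w); os.close(to_parent_r)
        for _ in range(n):
            os.read(to_child_r, 1)
            os.write(to_parent_w, b"x")
        os._exit(0)
    os.close(to_child_r); os.close(to_parent_w)
    t0 = time.perf_counter()
    for _ in range(n):
        os.write(to_child_w, b"x")
        os.read(to_parent_r, 1)
    elapsed = time.perf_counter() - t0
    os.waitpid(pid, 0)
    return elapsed / n                         # seconds per round trip

def direct_call():                             # the in-kernel analogue
    return b"x"

n = 10_000
t0 = time.perf_counter()
for _ in range(n):
    direct_call()
call_cost = (time.perf_counter() - t0) / n
ipc_cost = pipe_round_trip_cost(n)
print(f"function call: {call_cost*1e9:.0f} ns, IPC round trip: {ipc_cost*1e9:.0f} ns")
```

Exact numbers vary by machine and kernel, but the function call lands in tens of nanoseconds while the IPC round trip is typically microseconds.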


The Xbox OS is a cousin of Windows, and the PS4/PS5 OS is BSD-based. Well and good, but the fact that Nintendo still roll their own OS warms my heart. At least someone is trying a new thing!


What?! I thought Nintendo was building on BSD.



This is insane, but don't their developers also hate it? Like, I understand having the opportunity to engineer an OS from scratch and have millions of people using it.

But after the honeymoon ends and you now have to build and add features to it as the platform grows and you realise you don't have an army of OSS contributors helping you ship and innovate. Isn't it frustrating for them?


Probably not. The Switch OS is very small - under 400MB. And that’s including the NVIDIA drivers and WebKit (which is maintained by a third party called NetFront, so not their problem) and all graphical assets.

When you strip it down to the actual amount of code Nintendo maintains, by ripping out the NVIDIA drivers, NetFront, and the graphical and audio assets, it’s manageable. Compared to the complexity of the hundreds of video games that Nintendo has worked on, it’s probably not as difficult as it would seem.

There’s a reason why it’s small though - every game card comes with a copy of the required OS version and can self-update the console if needed. Not only can the game card update the console - other Switch players, using the version synchronization feature, can even update each other’s consoles wirelessly without internet for both the game and OS.

> but don't their developers also hate them

No idea where you got this from. The primary complaint regarding the Switch is the dated chip and performance metrics, not the system. Nintendo as a company meanwhile has a 98%+ retention rate. There were five people who worked on the original Super Mario Bros in 1985, all five of them worked in some capacity on Super Mario Bros Wonder in 2023.


But that retention rate seems more related to Japanese corporate dynamics, the whole silent contract of don't ask for more and we won't raise prices.

Anyway, if the entire OS is hand-rolled, how are they able to get WebKit and the Nvidia drivers running without much effort?

Like, I'm assuming they very much needed to build their own linker, their own executable format, even their own libc. That sounds like a lot of work to QC, document, and implement.

If it were their own flavour of Unix I wouldn't have so many questions. Unix already defines standards pretty well: file systems, process management, the scheduler, drivers, a plethora of helpful daemons. So is it basically just a highly modified Unix, implying a lot of Unix things just have to be ported over to their microkernel, or is it something more proprietary?

I ask this purely out of a thirst for knowledge. I've talked to a few enthusiast OS devs who build their own toy kernels, bootloaders, etc. from scratch, and those things, though great accomplishments, still have huge pain points they never stop talking about. USB drivers, for example, are notoriously hard to implement because the standard is so complex. If Nintendo made their OS entirely from scratch, did they go through all of that?


The Wikipedia Switch link you shared says that the Switch OS is a derivative of the 3DS OS[0], which is also a microkernel architecture according to its Wiki page.

[0] https://en.wikipedia.org/wiki/Nintendo_3DS_system_software


Huh, I always assumed the Switch ran on some flavour of custom Android, I suppose for no other reason than that it seems every digital device in the last 10 years has done so.

https://en.wikipedia.org/wiki/Nintendo_Switch_system_softwar...


If it was any sort of Android, we'd probably have known about it, and been running all sorts of Android things within minutes of its release.

Given that the Tegra X1 chip it runs on was already a couple of years old when the Switch released, they might have had to perform some magic to get performance to where it is even with a cut-down Android; even under their custom OS, their flagship game, TLoZ: Breath of the Wild, doesn't run smoothly.


I always thought the Switch OS was based off of FreeBSD, I guess I have my facts wrong.


The Switch uses bits of FreeBSD for networking, but so does Windows. There are also bits of the Android display stack (NVNFlinger sounds a lot like SurfaceFlinger), but none of that changes that this is a mostly homegrown Nintendo project.

https://media.ccc.de/v/34c3-8941-console_security_-_switch


Windows has been BSD-clean since Vista, when the networking stack was rewritten.


Long live GNU/Hurd! I remember trying it out over a decade and a half ago. It had an impressive number of packages then too. Performance, drivers, and drastically different security models, combined with the lack of a sizable community, are what stopped me then. I feel like the world peers in on occasion and silently roots, “Go Hurd!”


What was the difficulty with its security model?


Not really a difficulty, just different enough that I never wanted to try it for anything real.


It is written in the annals of the galaxy that the Great Prophet Zarquon on his return will bring with him the first publicly useable version of GNU/Hurd which will signify a change for the universe so great as to be incomprehensible to mere mortals.

Elsewhere it is written that Zarquon shows up in Milliways™, the restaurant at the end of the universe, where he'll utter the following phrase just before the end of the universe: Sorry I’m a bit late, had a terrible time... All sorts of things cropping up at the last moment. Uh, how are we for time?

It is left up to you, dear reader, to interpret in what way the Coming of the Hurd (and Zarquon, naturally) will signify a massive change for the universe.


Thank you, Mr Celine.


Kind of an aside, but it's interesting to see Debian mentioned positively on gnu.org, since they consider Debian a non-free distribution [1], and I see that they have even updated its entry recently, after Debian decided to include non-free firmware on its default installer.

[1] https://www.gnu.org/distros/common-distros.html#Debian



