
I shouldn’t be surprised, but desktop environments using >1G of ram as standard strikes me as insane.

I get that they’re doing a lot but my first custom built PC had 1G of memory for the entire system (and that was absurdly large by those days standards; somewhere around 2005).

I don’t see terribly large improvements in functionality between 2005’s gnome2 and what exists now in mate or XFCE, bearing in mind that I was playing with beryl/compiz at the time too.

Does anyone know what’s contributing?



I feel similarly about Windows. Way, way more resource hungry. Tens to hundreds of times, depending on the metric. Barely does more than Win95, as far as user-facing features. Shit I bet most of the important features would have fit in 5-10MB of "power tools" addons with barely-measurable effects on memory and cycle use. Win95 ran very comfortably on 64MB (with an M) of memory and a 4GB disk. You can't even install the OS on a disk that small anymore, let alone have room left for anything else.

As for Linux, I remember using Windowmaker for a long time, and later XFCE, because Gnome and especially KDE (funny how those have reversed position performance-wise now, eh?) would make my poor Celeron laptop cry. But they did run, and didn't even come close to maxing out my 384MB of RAM in that machine.


>I feel similarly about Windows. Way, way more resource hungry. Tens to hundreds of times, depending on the metric.

You're incorrect here, as that's not an apples-to-apples comparison. If you've got the RAM, Windows (and macOS too, I bet) caches a lot of frequently used apps and files into RAM at boot, as any sane desktop-focused OS should do to give you a good experience, and it frees that memory back up as other apps start to need it. If you've got free RAM, why shouldn't the OS put it to good use to improve the UX?

I never understood the obsession many enthusiasts have with speccing PCs with huge amounts of RAM, then smugly competing for the least amount of idle RAM used by their system. That's not the metric of a good desktop OS. What do you gain by staring at large amounts of RAM, which you paid for, just sitting idle and unused? RAM is there to be used, and if free RAM is available and the OS is smart about using it and freeing it on the fly to boost the UX, then by all means use as much as you want.

My parents run Windows 10 on my 11-year-old 4 GB laptop with a spinning-rust drive, using Chrome as a browser, and it works without ever crashing or needing to kill apps, so IMO Windows is stellar at managing memory/resources even when they're in very short supply. Meanwhile, on the same laptop, Ubuntu's Gnome would just completely lock up after a long Chrome browsing session, or its OOM killer would straight up kill Chrome, both cases yielding a much worse UX than on Windows, making Windows the far superior desktop OS in this case.

IMHO a better and more realistic metric would be idle CPU usage, and how the OS deals with low-memory scenarios like the one I described above, not how little idle RAM it uses on PCs with large amounts of it, as that's just pointless.


I think that's missing the point of the GP, because our Windows 9x machines with 64 MB of RAM also had plenty of room for cache. Of course, there have been real advances in GUIs since Windows 9x, and the 9x kernel was hardly state of the art even for its time. But I don't know if a minimum of 2 GB of RAM, and probably more like 4 GB for a usable system, is really necessary.


You can't do an apples-to-apples comparison of Windows 95's RAM usage logic with the one implemented in Windows 10/11. Of course old OSes used less.


>Of course old OSs used less.

"Why" is the question here. New cars don't use more petrol. And new bikes are not heavier then old ones.


But then they turn around and burn 100x (or more) the cycles-at-idle of 90s operating systems and constantly hit disk for no clear reason despite all this supposed performance-enhancing caching (try using Vista or newer on spinning rust—it's very clear that they're constantly doing disk I/O for normal shit like opening the start menu or just mousing over some UI elements, so much for "smart" caching of frequently-used things at boot).

I absolutely do not think intelligent caching is behind enough of the increased memory use to absolve them of all the waste. I don't believe Microsoft are being more respectful of memory than they are of other system resources. It also fails to explain the huge increase in memory footprint of Linux desktop environments and window managers. I don't think XFCE is caching your most-used "apps" in memory at launch.


Free RAM is used as disk cache and sacrificed to web 2.0 jabbscript SPA blogs that try to improve your browsing experience by consuming more RAM than you have.


OS RAM usage and website browser RAM usage are two different things.


They consume the same RAM. There's no infinite web 2.0 RAM.


My point was about who's using it. You can't blame the OS that bloated websites and browsers need too much RAM.


I’ve got 128GB of RAM and I would _love_ my OS to use it. But unless I’m explicitly blocking out memory by running a VM or loading a large dataset I’ve never seen it go above 20GB, and even that’s after turning off paging and tweaking windows to cache stuff more aggressively.


Try ZFS.


That’s a common argument, but in Linux the two are reported separately.

I believe it is the same in Windows and macOS too.

So “used RAM”, from the OS’s perspective, is not filesystem cache and can’t be reclaimed the way cache can.

I doubt they’re increasing performance either as the overall performance of such a system is so low.

It’s also hard to argue increased performance when your app could instead have fit in L3 cache.
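The free-vs-cache distinction the parent describes is visible in `/proc/meminfo` on Linux: `MemFree` is truly idle memory, while `MemAvailable` also counts reclaimable page cache. A minimal sketch of reading that split (the sample figures below are made up for illustration):

```python
def parse_meminfo(text):
    """Return a dict mapping /proc/meminfo field names to values in kB."""
    fields = {}
    for line in text.splitlines():
        name, _, rest = line.partition(":")
        fields[name.strip()] = int(rest.split()[0])  # first token is the kB value
    return fields

# Illustrative sample; on a real Linux box you'd read open("/proc/meminfo").
sample = """\
MemTotal:       16326428 kB
MemFree:          512340 kB
MemAvailable:    9873216 kB
Cached:          8120448 kB"""

m = parse_meminfo(sample)
# "Available" dwarfs "free" because most of the cache is reclaimable on demand.
print(m["MemAvailable"] - m["MemFree"])
```

So a naive "used = total - free" reading makes a healthy, cache-heavy system look starved.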


What hardware giveth, the software developer taketh away. For a variety of reasons, today's devs are terrible at writing software that is even remotely efficient.


for i in 0..1000000; do long slow test echo "not last" > /dev/null; if last thing then do slow test again to be sure then echo "last"; done


> Win95 ran very comfortably on 64MB (with an M) of memory and a 4GB disk.

I ran it on 8 MB RAM / an 850 MB disk. I did upgrade to 16 MB RAM and the performance boost was palpable, but, again, I used it at 8 MB for a while. Also, a fresh install took up a whopping 50 MB!


Right, hence the "very comfortably". That wasn't min specs, that was a very happy Win95 system.

> Also a fresh install took up a whopping 50 MB

I do wonder WTF modern Windows is doing with tens of GB of disk space, not even counting swap and such. It seems like it jumped up dramatically after WinXP; even the otherwise-kinda-decent Win7 used crazy amounts of disk. "They started including a bunch of drivers" is bullshit: Linux has always included loads of drivers in most default installations, and it's so much smaller that just calling it "smaller" doesn't do the difference justice. Plus, plenty of those drivers (for extremely niche or very old hardware) could easily be made an optional download or a default-off package.


> my first custom built PC had 1G of memory for the entire system

Windows 95 ran on 4 MB, and very well on 8 MB. I believe I ran Linux with fvwm2 on 8 MB as well.

Of course 64-bit and higher resolution increase RAM requirements to some degree, but I’m still not sure what Openbox needs 600 MB for. ;)


> A Spellchecker Used to Be a Major Feat of Software Engineering

> Fast forward to today. A program to load /usr/share/dict/words into a hash table is 3-5 lines of Perl or Python, depending on how terse you mind being. Looking up a word in this hash table dictionary is a trivial expression, one built into the language. And that's it. Sure, you could come up with some ways to decrease the load time or reduce the memory footprint, but that's icing and likely won't be needed.

https://prog21.dadgum.com/29.html

5MB here. 5MB there. No one notices ;)
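The "3-5 lines" claim from the linked post is easy to check; a minimal sketch with a tiny inline stand-in word list (on most Linux systems you'd load the real `/usr/share/dict/words` instead):

```python
# The whole "modern spellchecker": a word list in a hash set plus a lookup.
# Real version: words = set(open("/usr/share/dict/words").read().split())
words = {"hello", "world", "spell", "checker"}  # tiny stand-in word list

def check(word):
    return word.lower() in words

print(check("Hello"), check("helo"))  # True False
```

That the hard part became a one-line membership test is exactly the post's point; the cost moved into the runtime and its hash tables.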


I wonder how much of that comes from careless programming (accidental complexity), and how much is essential complexity in keeping with user expectations for polish & responsiveness (driven by Apple, etc).

Another possible source of the problem I can imagine is the ubiquitous programming style of using high-level languages and importing libraries on a whim. If the language does not strip out unused code paths in dependencies, sizes can quickly add up, with the same libraries being reimported multiple times.


When few resources are available, one has to be more careful (about not accidentally writing O(N^2) code where O(N) would work, and not wasting RAM on multiple copies of the same data), and at the same time one has to refuse to add non-essential features with a bad ratio of hardware cost to user value.

In corporate software it is often easy to find low-hanging fruit that allows reducing CPU/RAM usage by 10x (but the incentives are stacked against doing this). In open-source software like KDE/Gnome I think it would be hard to find blatantly inefficient code, and the main reason is the overall complexity, which is hard to reduce without sacrificing features.
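The accidental-O(N^2) pattern mentioned above often looks like a membership test against a list inside a loop; a sketch of the same deduplication done both ways:

```python
def dedupe_quadratic(items):
    """O(N^2): 'x not in seen' scans the whole list on every iteration."""
    seen, out = [], []
    for x in items:
        if x not in seen:
            seen.append(x)
            out.append(x)
    return out

def dedupe_linear(items):
    """O(N): the set makes each membership check O(1) on average."""
    seen, out = set(), []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out

print(dedupe_linear([1, 2, 1, 3, 2, 4]))  # [1, 2, 3, 4]
```

Both versions return the same answer, which is exactly why the quadratic one survives code review until someone feeds it a million items.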


In 2005 my resolution was 1024x768 on a 60(?) Hz screen. That's about 2 MB per frame. Right now I have two 2560x1440 HDR monitors with 144 Hz refresh rates hooked up to my machine; with double buffering that's about 100 MB just for screen buffers. I believe it was Windows Vista (apologies, I'm not confident in my Linux versions) that introduced the DWM compositor, meaning that all of a sudden you need all of the frames of all of the windows in memory, times two.
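A back-of-the-envelope check of those figures, assuming 24-bit color for the 2005 frame and 8 bytes per pixel (FP16 RGBA) for HDR today:

```python
MB = 1024 * 1024

# 2005: one 1024x768 frame at 3 bytes/pixel (24-bit color)
old_frame = 1024 * 768 * 3

# Now: two 2560x1440 monitors, 8 bytes/pixel assumed for HDR, double-buffered
new_buffers = 2560 * 1440 * 8 * 2 * 2

print(f"{old_frame / MB:.2f} MB then, {new_buffers / MB:.1f} MB now")
```

That lands around 2 MB then and a bit over 100 MB now, consistent with the numbers in the comment; the exact total depends on the pixel format the compositor actually uses.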


When you feel your desktop is too heavy, bury your woes by firing up a web browser.


I started using Ubuntu 8.10 with Gnome 2 and Compiz in January 2009. I recently stopped to think about what I'm getting with Gnome 3 on Ubuntu 20.04 that I didn't have in 2009. The only important feature I came up with is gsconnect/kdeconnect, which maybe would have been impossible with Gnome 2, but maybe not. Everything else feels like nice-to-have small aesthetic improvements, which I would do without if I consider all the time I had to spend configuring away the new stuff I didn't like. I used Gnome Flashback until the upgrade to 18.04, when I was sure I could put together a dozen Gnome Shell extensions to get a desktop I could live with. Switching back to Windows, or to a Mac with the macOS GUI, would have been worse for me. Tastes.

Then there is the inevitable maintenance to keep the desktop environment compatible with the world that is changing around it. Kernels, drivers, etc. "It takes all the running you can do, to keep in the same place."



