You and me both (though I'm also annoyed by the overuse of macros instead of inline functions and the woeful lack of docstrings and type declarations).
Using macros instead of inline functions is just archaic, not necessarily non-idiomatic. `inline` wasn't standardized until C99 (though some compilers supported it earlier), and even well after that point macros were more reliable for actually getting inlining.
Calendrica has been around for that long already? How time flies. Was also interesting to find an off-by-1 bug in final (if the condition doesn't hold for initial the result is smaller than initial instead of greater than or equal to it). Interesting little program to read, though.
I did a bunch of distro hopping in the 90's but locked onto Debian (mainly testing, now largely unstable) not long after. I'm still just not sure what compels people elsewhere. Especially now: the Debian installer was vicious if you were a newbie, but I hear it's pretty ok now.
This is largely a me problem! I don't understand what the value add of other offerings is. It's unclear what else would be good, or why. Debian feels like it really has what's needed now. Things work. Hardware support is good. Especially in the systemd era, so much of what used to make distros unique is just no longer a factor; there's a common means of operation across most Linux systems. My gut tells me we're forking and flavoring over not much at all. Aside from picking up some new commands, recently learning Arch was a complete non-event for me. It feels like we're having weird popularity contests over nothing. And that amplifies my sense of: why not just use Debian?
But I also have two-and-a-half-plus decades of Linux behind me, and my inability to differentiate and assess with beginner's eyes is absolutely key to all this. I try to ask folks, but it's still unclear what the real motivations are and, more, what real differences there are.
Desktop mode on the Steam Deck uses X11. I think that's why the brightness control is fucked up (ever notice how the brightness is always 40% when you switch to desktop mode?). You can manually switch it to Wayland, but Steam input is broken under Wayland (or at least it was last time I checked, which is admittedly half a year ago or something).
That's odd, I'm using Wayland on my desktop and, for example, Japanese input works 問題ない ("no problem"). Then again, I haven't tried every possible input method/peripheral in existence, so I may just be the one in [some arbitrary large number] who lucked out.
Separate scaling fractions on separate monitors don't work under X. Well, I lie: they do work under Zaphod mode, but no applications other than Emacs support that.
Heh. Just today I started fooling around with a new X11 setup on a barebones Ubuntu Server VM with just xorg, xinit, xterm, Emacs and i3.
It's pretty neat learning about IOMMU groups and doing NVMe passthrough with KVM/QEMU, and also messing around with the (new to me) SPICE/virgl 3D acceleration. I was impressed that I was able to play YT videos in the Ubuntu VM under Virtual Machine Manager with a hand-built mpv/ffmpeg + yt-dlp setup without dropping too many frames or serious glitches. Huzzah for libgl1-mesa-dri.
After that, I rebooted the host OS, jumped into the UEFI boot menu, and booted the "guest" NVMe disk directly with my actual GPU, and it still worked. It's quite a trip down memory lane, typing 'startx' and having both :0.0 and :0.1 displays. That muscle memory from the 1990s is still going strong.
I miss the simplicity of how I remember XFree86 running on the alt-f7 terminal, and having alt-f1 through alt-f6 for my own needs... a second X on alt-f8 when I got 64MB of ram. ctrl-alt-backspace to quickly kill X and restart it (within a few seconds on a 486).
Then, gradually, these things disappeared from Linux for no good reason. You can still configure them, but someone decided in their infinite wisdom that some of the most compelling features just weren't really needed anymore, in favour of rewriting the display manager (XDM) again and again, until now there are too many of them and none of them are really any better than what we had in the 90s.
I had to put that in my .xinitrc, because like you I really missed that feature. I also made a .Xresources file and had to remember that xrdb was a thing. Good times, good memories. I also remember the jump to 64MiB of memory, it was a big deal! I think I got a Gravis UltraSound right around then too.
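For reference, on a modern Xorg setup the zap shortcut can usually be restored with a one-liner in .xinitrc (assuming setxkbmap is available; terminate:ctrl_alt_bksp is the stock XKB option name):

```shell
# Re-enable Ctrl+Alt+Backspace to kill the X server (disabled by
# default in modern Xorg); put this before the window manager line
# in ~/.xinitrc so it runs on every session start.
setxkbmap -option terminate:ctrl_alt_bksp
```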
I stopped my nostalgia journey short of pimping out my console with fonts and a higher resolution (sadly only fbcon works now; the old VGA text modes are a legacy BIOS thing, I think), and of enabling the Alt+SysRq+g key in the kernel for dropping into the kernel debugger, but there is always tomorrow!
Running X11 on Ubuntu 22.04 - I have a 2650x1600 main at 150% scale and a 1920x1980 secondary at 100% scale. Essentially they're the same virtual size side-by-side. This _only_ works on my nVidia GPU...
The way to get a website for your small restaurant used to be having Jim's nephew make one for you and you'd give him a pizza and a six pack as payment for setting it up.
As someone who also first got introduced to Lisp through SICP and Scheme, I don't really care about Lisp-1 versus Lisp-2, but I don't much fancy minimalism. I switched to CL for the type declarations and just got used to funcall and #' (sharp-quote); and minimalism means the things a larger language would provide out of the box (say, hash tables) you either have to write yourself or find a library for. Hence the various Schemes (Guile, Gauche, Chicken) shipping a ton of libraries beyond the standard.
In fact, I'd say CL is too minimalist as well, hence the CDR (Common Lisp Document Repository, a bit like the SRFIs) and the various libraries that form a sort of unofficial extended core (Alexandria, Bordeaux-Threads, CL-PPCRE, ...).
But there's a value in having a defined, stable language definition. Being able to rely on the basic language not changing is a feature, not a bug. Though it does mean you have to sometimes search for a good lib if you don't have a feature built into the language.
My comment said nothing about language permanence, though I would say that some measure of evolution can sometimes be for the better. I doubt many people would prefer programming in Java 1.4 over Java 21.
The sun won't go supernova, but it will become a red giant in about 5,000,000,000 years, which will have roughly the same consequences for life on earth.
To be honest, I can't predict whether humanity survives the next decade, or two or three, or even a century, let alone millions of times longer than that.
We humans are very likely to be our own worst enemy. I would wish for the world to make it to the five-billion-year date you mention.