That's why he doesn't like Rust and is so big on C and Go.
I don't buy into this mentality. It's just wasting time. Your users suffer from bloat and slowness in commercial software (like websites full of tracking and heavy UI design) for economic reasons (no benefit to the company in optimizing user-facing performance), not because of system software like the kind Drew writes.
It's just kind of a LARP of "old school" and "minimalistic", which is fine. Let's just not pretend there's some practical benefit.
Note, however, that using a 2012 CPU in 2022 is far more viable than using a 2002 CPU in 2012 (due to Intel's complacency, AMD's inability to compete during its construction-vehicle era of chips, the crossing of several good-enough performance lines, better upgradability, and the general slowing of Moore's law, at least for CPUs).
In 2022 a Sandy Bridge (2012) computer is still a viable computer and viable for the foreseeable future. In 2012, a late Pentium III/early Pentium 4 was not yet an antique but already severely outdated.
I'd argue that's a myth. If you use peripherals from 2012 and do simple office tasks, maybe. Try watching 4K (or FHD 50 fps) video on a 4K display on Sandy Bridge with a GPU from that era. Try editing iPhone video and raw photos. Even compiling Linux is an order of magnitude faster, thanks to 20 faster threads instead of 4.
Speaking as someone who used a 10-year-old computer as my main machine until a year ago: no, it's very realistic. At no point did I come across a game my computer wasn't powerful enough to play. Even compiling large projects is less helpful than you'd think: a clobber build is still context-switch-while-it-finishes, and incremental builds usually don't build enough for all of the extra threads to be worth it.
Sure, maybe my computer might have screamed at me if I tried to watch AV1 or H.265 4K video, but... my computer is 10 years old and so is my monitor, so I don't have a 4K monitor. And, quite frankly, I don't see the point of upgrading from 1080p to 4K.
Editing 4k videos/raw photos is an outlier as far as PC tasks go.
Your argument "that's a myth" is based on a task that's so small in terms of percentage it's literally a rounding error.
I'm typing this on a 2010 computer (i7 870 @ 2.93GHz, 16GB RAM), which I also use to write software (C, Lazarus, JS, Go, using VSCode, Vim, Emacs), browse the web, and use web apps (miro, tinkercad, asciiflow, a few others).
Inkscape works acceptably, and light image retouching in Gimp works acceptably too.
I used to play Starcraft II, Far Cry 3/4/5 on this machine until an upgrade nuked my Wine installation and I didn't bother to fix that.
My Steam library is filled with games that work on this machine.
Out of the thousands of tasks that people use their computer for, two of them can't be done acceptably on this machine.
Probably not that many, but the parent post to mine specifically mentioned editing iPhone footage.
A lot of people editing iPhone footage are going to use a phone-native app like CapCut, but even then, coping with the horribly inefficient compression of h.264 and h.265 footage gets their phones up to egg-frying temperature very quickly.
My wife was rocking an Ivy Bridge MacBook Air until just this year, and it was working totally fine for all non-browser activities. The only thing that caused her to switch (which she didn't want to do) was that her browser, with 300 tabs open, was grinding to a halt.
> It's just wasting time. Your users suffer from bloat and slowness in commercial software (like websites full of tracking and heavy UI design) for economic reasons (no benefit to the company in optimizing user-facing performance), not because of system software like the kind Drew writes.
And if system software developers believed that line of reasoning, then system software would be slow as well; and if commercial application vendors gave it half as much attention and effort, their software wouldn't be so bloated and awful.
> Your users suffer from bloat and slowness in commercial software (like websites full of tracking and heavy UI design) for economic reasons (no benefit to the company in optimizing user-facing performance), not because of system software like the kind Drew writes.
Did you not read the article? He makes a direct comparison between two window managers and the different levels of performance they provide. The system software obviously matters.
Moreover, there are huge benefits to optimizing performance, even from a corporate perspective. There just often aren't many market-driven reasons to do so, which is why many companies are willfully blind to this truth.
So, your assertion seems to boil down to "everyone else does it in a crappy way because they are ignorant, so you should too, to avoid 'wasting time'." It's a bizarre take.
<< It's just kind of a LARP of "old school" and "minimalistic", which is fine. Let's just not pretend there's some practical benefit.
Hmm. Like with most things, there are benefits and drawbacks to this approach, but outright dismissal as 'LARP' is, at best, uncharitable. Are you sure you cannot come up with at least one good practical reason?
I'm not dismissing it. Doing stuff a certain way because it makes it fun for you is OK. I'm not far from Drew myself, with my dotfiles, doing everything in the CLI, and being a die-hard Vimmer. And I love embedded, low-powered devices too. But I'm not going to pretend there's some big practical reason to do things this way. My VSCode-using, mouse-clicking friends are as productive as I am and produce similar software.
The idea that using an underpowered computer makes you write faster or better software is just silly. Especially since all the really fast and well-known projects of the last few years tend to come from Rust people, with their super heavy compiler that has no respect for RAM or storage usage.
It's by definition always better to have a faster box than a slower one. You want to see how your software works on much slower boxes? Run it with a CPU quota or in a VM. People don't do it, not because they can't or because their computers are too fast, but because they have other priorities.
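For instance, here's a rough sketch of what I mean (it assumes Linux with systemd; the 25% quota and the "slowbox" name are arbitrary, just for illustration): a tiny Go wrapper that runs any command in a transient scope with a CPU quota.

    package main

    // slowbox: hypothetical wrapper that runs a command inside a transient
    // systemd scope with a CPU quota, to roughly approximate a much slower
    // box. Assumes Linux with systemd; the 25% figure is an arbitrary pick.

    import (
        "fmt"
        "os"
        "os/exec"
    )

    func main() {
        if len(os.Args) < 2 {
            fmt.Fprintln(os.Stderr, "usage: slowbox <command> [args...]")
            os.Exit(1)
        }
        // Equivalent to: systemd-run --user --scope -p CPUQuota=25% <command> [args...]
        args := append([]string{"--user", "--scope", "-p", "CPUQuota=25%"}, os.Args[1:]...)
        cmd := exec.Command("systemd-run", args...)
        cmd.Stdin, cmd.Stdout, cmd.Stderr = os.Stdin, os.Stdout, os.Stderr
        if err := cmd.Run(); err != nil {
            fmt.Fprintln(os.Stderr, "run failed:", err)
            os.Exit(1)
        }
    }

You could just as well call systemd-run directly or cap the cores in a VM; the point is that simulating a much slower machine costs a few minutes, not a hardware downgrade.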
<< The idea that using an underpowered computer makes you write faster or better software is just silly.
I will admit that I would rather someone with more experience commented on this rather than me.
That said, I disagree on general principle.
When you are restrained (sometimes severely restrained) by the environment, the net result is that you use resources a lot more efficiently. I absolutely get that we are kind of spoiled now with crazy hardware that is basically a supercomputer from not so long ago, and it is hard to let that convenience and power go, but it contributes to the bloated software landscape we have today, which manages to overuse just about every resource available.
With limited power, you are effectively forced to write better. It is not completely unlike training with weights. It is not silly. It produces real results.
I did not take his point to be that low powered hardware was superior. Rather, he states that older hardware is fine and good enough. That has been my experience as well. You state something similar above.
Developers should be using hardware comparable to what their intended audience uses, at least some of the time. It's hard to effectively address the needs of your audience if you aren't experiencing their problems.
I have to disagree: devs should use hardware appropriate to their development environment. They should test on hardware appropriate to their target environment (or at least a simulated version of it). Insisting on developing on underpowered systems can block you from using some of the more useful development tooling, like many profilers and some debuggers, effectively or even at all. You need to know that it works in the target environment, but if the target environment is a 2005-era CPU because the customer wants a "ruggedized" laptop (aka overpriced old hardware in a very tough case), there's no reason to restrict your devs to one core and 4GB of RAM for their development tools.