maxrecursion's comments

I have been daily driving Fedora on my laptop for over a year now and haven't run into a single issue. Not saying you're lying, but if you are having that many serious problems it might be a hardware issue.

The OS is definitely stable and perfectly fine to use.


I've installed and extensively used at least half a dozen different flavors of Linux on about as many computers over the last decade or two. I don't think there's been a single time where I didn't have to work around hardware compatibility issues. Just today, I tried printing a few pages on Linux, and could not get a single one to come out. I'd queue a job, it would just disappear without anything happening. I'd unplug the USB cable, reconnect it, try again, a page would come out with nothing printed on it. I'd restart the printer entirely, try again, nothing would happen. I'd unplug the USB cable, reconnect it, try again. Oh, maybe it's working? Oh no, the printer hard-failed halfway through printing the page, so I have to unplug the power cable, restart the printer, scrap that page, and try again. And so on, and so forth, for about an hour. I've been through this dance several times now and probably wasted at least 10-20 hours (re)installing printer drivers, messing with cups and boot configs, etc. This time I finally had enough, moved the printer over to my Windows machine, plugged it in, and printed my fucking pages. I have many other examples of wasting time trying to get basic stuff like this to work on Linux.

To be fair, in general "hell is empty, and all the printers are here"

https://www.youtube.com/watch?v=pQGtucrJ8hM


Picking the location during setup is a hardware issue? I find that hard to believe.

And do you mean hardware issue or hardware incompatibility?

The former would most likely manifest itself across many operating systems, but if you mean the latter... why would that matter in terms of a given person deciding whether to switch to Linux?


AFAIK the location lock is an nvidia driver issue... or more precisely the pitiful state of the open source nvidia drivers.

Yes, the hardware you choose or are given will heavily influence your Linux experience. I kinda wish the community were more proactive about making lists of "certified" hardware that is likely to cause the fewest problems...


If we’re collecting anecdata, I installed Fedora fresh on a Framework a couple months ago. I like the cohesiveness of GNOME these days, but I’ve seen a couple of issues, like nonstop notification bells on repeat or an inability to wake the screen when plugged into a monitor, that feel not prod-ready.

I don’t know that I’d expect Windows to be much better either, but that’s my experience.


Me as well. Never had any of those issues.

The same hardware runs windows and hackintosh flawlessly.

I bet this is Windows-certified hardware. Try something designed for GNU/Linux?

Idk and I don't care; you're making my point. I'm in tech and I don't even know what "hardware designed for GNU" means. Normies won't either: they have a computer, they want an OS. If the OS shits the bed during the initial location selection, they won't swap their wifi module or motherboard for a "GNU compatible" one; they'll install Windows.

> I'm in tech and I don't even know what "hardware designed for gnu"

Honestly, I knew I was in a tech bubble, but not that much. I mean, if you want your hardware to work with Windows/Linux/macOS, you just buy it with the OS preinstalled. If you install something unsupported, non-working devices are your problem, not a problem of the OS or the hardware.

Just like macOS will probably not work on a Windows laptop, Linux will not necessarily work on a Windows laptop.


There are amazing retro games that are still awesome to play to this day. To say they all suck and it's just nostalgia is not true at all.

Sure, a lot of them suck, especially on the Nintendo 64 because of the 3D transition, but from the NES onward there are timeless classics.

My kid beat Super Metroid several times; he decided to play it on his own on his Switch, and he loved it. He plays the old Pokemon games too. In other words, that's a terrible analogy.


You're choosing the top 10 games on the Nintendo 64 and NES to make your analogy, out of the thousands and thousands of games produced for those systems. Give your kid game #50 (Waialae Country Club: True Golf Classics on N64) and see if they would prefer it over literally any modern game that ranks on Steam. My analogy holds.


Why would you compare "any modern game that ranks on Steam" with random games from the era?

You said

> It's the same phenomenon where people think their Nintendo 64 or PlayStation 2 was a masterpiece never paralleled, revisit it in 2025, and realize: "wow... this... sucks actually."

I actually tried re-playing PS games I remember enjoying, and I still enjoy them.

I see what you mean about the fact that people look at old stuff with rose-tinted glasses, but really some things did age well (including parts of the early web).


I started using Proxmox a couple of months ago and it's been great. A really solid alternative to VMware after the Broadcom takeover.


My VMware continues to work the same.


"Most teams/products I have been involved in, the stack always grows to the point that a dev can no longer test it on their own machine"

Isn't this problem solved by CICD? When the developer is ready to test, they make a commit, and the pipeline deploys the code to a dev/test environment. That's how my teams have been doing it.


This turns a 1 hour task into a 1 day task. Fast feedback cycles are critical to software development.

I don't quite understand how people get into the situation where their work can't fit on their workstation. I've worked on huge projects at huge tech companies, and I could run everything on my workstation. I've worked at startups where CI was passing 5% of the time and requiring 3 hours per run, and you can now run it on a workstation in seconds. What you do is fix the stuff that doesn't fit.

The most insidious source of slowness I've encountered is tests that use test databases set to fsync = on. This severely limits parallelism and speed in a way that's difficult to diagnose; you have plenty of CPU and memory available, but the tests just aren't going very fast. (I don't remember how I stumbled upon this insight. I think I must have straced Postgres and been like "ohhhhhhhhh, of course".)
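For illustration, disabling durability on a throwaway test database looks roughly like this (a sketch assuming a disposable Docker-based Postgres; the container name and password are made up, and these settings must never be used on a database whose data you care about):

```shell
# Throwaway Postgres for tests: the data is disposable, so durability can go.
# fsync=off stops Postgres from forcing the WAL to disk on every commit,
# which is the hidden bottleneck described above; synchronous_commit=off
# and full_page_writes=off trade safety for speed in the same spirit.
docker run --rm -d --name throwaway-test-pg \
  -e POSTGRES_PASSWORD=test -p 5432:5432 \
  postgres:16 \
  -c fsync=off \
  -c synchronous_commit=off \
  -c full_page_writes=off
```

If the container crashes mid-test the data may be corrupt, which is exactly the trade-off: tests recreate their state anyway, so you pay nothing for giving up crash safety.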


It's likely you haven't come across these use cases in your professional career, but I assure you it's very common. My entire career has only seen projects where you need dozens to hundreds of CPUs in order to have a short feedback loop to verify the system works. I saw this in everything from simple algorithms in automotive to Advanced Driver Assistance Systems and machine-learning applications.

When you are working on a software project that has 1,000 active developers checking in code daily and requires a stable system build, you need lots of compute.


There are a lot of folks in startups who think 100 devs is a large org and can't comprehend the scale at which "100% tests pass" stops being a build blocker. I've migrated from such an org to a late-stage startup where "tests must pass" holds even if fifty engineers are blocked with their PRs and the release train is fully halted. "But our pipelines must be green": no, they don't, at least not all of them.


You just need faster tests. ;)

Also, if you're booting kernel or device drivers you need the hardware. Some of this is not desktop hardware.


When you want to run the whole test suite, yes.

When you're developing and only need to touch 0.1% of the product and 0.001% of the code, that's a total and complete waste of time.


While that is a crappy thing to do, I bet tons of YouTubers are doing just that. Hell, most political YouTubers just read articles and make stupid comments about them.

It would be impossible to create daily content if you weren't just rehashing, or taking, information from somewhere. Again, not defending it at all, just saying it's probably a very common thing. Like how some crappy news articles are just a bunch of reddit comments, as if that qualifies as news.


If you ever fall into "hustler-get-rich-quick" shorts/reels/tik-toks, it is full of people laying out the same exact scheme:

Make a channel

Find popular reddit/social media post

Use AI tools for text to speech

Use AI tools to generate pictures

Stitch it all together

Post on channel.


Agreed. Extra: I'd generally say that comments on HN are often interesting and insightful (that's why we're here, no?). With the current state of social media, I'd wish for a little more HN flavor. But at least credit your source. The information you provide doesn't get less valuable only because someone else did the work.


> While that is a crappy thing to do,

I haven't watched this particular channel so maybe it's obviously shady, but I'm curious: why is this conceptually a crappy thing to do?

I mean, if you take the IP of others and redistribute it verbatim then I definitely see the ethical issue. So if the claim is that he's reading peoples' comments or posts verbatim without credit then yeah that's crappy. Don't get me wrong.

But if all we're talking about is "mining" websites like HN for topics and then creating original content that covers those topics in a different format for a different audience... where's the issue?

A few years ago I was feeling pretty burned out in the tech industry and created a tongue-in-cheek "luddite" channel called TechPhobe where I took an overly pessimistic view of the industry. At the time Elizabeth Holmes was on trial, and a lot of my videos involved me reading ArsTechnica articles on the subject (credited) while offering my personal opinions on the matter. While not successful, those videos got more views than anything else I ever created. Was that a crappy thing to do? I didn't think so at the time and I don't think so now.

I didn't stick with the channel because I realized pretty quickly that if I'm dealing with burnout the last thing I should be doing in my spare time is focusing on tech content lol


> But if all we're talking about is "mining" websites like HN for topics and then creating original content that covers those topics in a different format for a different audience... where's the issue?

Plagiarism, generally. I really enjoyed the semi-recent hbomberguy video on why it matters, and a later response (from another channel) on "The Somerset Scale of Plagiarism" for a more rigorous explanation of what the different kinds of "content reuse" can be. Those are generally where my current model of plagiarism comes from.

A specific concern would be the inaccurate telling of information that isn't understood. A video saying, "Here I will summarize this HN thread," is perfectly ok, and a good thing. A video saying, "Here I will tell you how $thing works," should be well researched and cited. Doesn't matter if the content's entirely from an HN thread or from 40 different SEO farms; it's low-quality content, and it's wasting everyone's time at best and probably actively misinforming people. (Because how true and complete is information gleaned from HN comments anyway?)


That's a good idea for how the ultra rich can pay taxes without forcing them to liquidate stock, and at least addresses the core of the problem.

But if the stock "shared" with the tax collector paid enough in dividends to cover the taxes, couldn't they just pay the taxes with the dividends to begin with?

How does sharing the stock help? Wouldn't the tax collector need to liquidate the stock to get any actual value from it?


A stock retains value even without producing dividends. Allowing wealth-tax payments in the form of stock, rather than cash, addresses the liquidity issue, especially for large family-owned companies. The challenge lies in accurately valuing the stock for private companies, but that's where finance experts come in! Furthermore, an agreement between the taxpayer and the wealth fund could accompany the transaction, including terms like holding the stock for a specified number of years, buyback preferences, or limited voting rights, possibly at a discount on the stock price.

Given the financial and legal complexities, as well as the challenges in standardizing the process, this would only apply to payments of a highly substantial amount. But one could argue that these individuals are no longer equal to others in terms of their tax obligations, as they have some ability to negotiate to a certain extent. That's probably the main problem.

Just thinking out loud!


Considering $100,000, or even up to $200,000, rich is just idiotic, especially when you account for where the person lives and whether they have kids.

It can definitely be a comfortable lifestyle on a budget, but it is by no means rich. Housing costs alone make it blatantly middle class. Add in kids, which bring their own costs, and saving for college, which is basically a second mortgage if you save what you're supposed to, and that income range is just getting by like everyone else.

They just might have a bigger house, nicer cars, and take better vacations. But they are busting their ass for those slight luxuries.


Do you mean $200,000 a year or in assets?

Either way, I don't see how that's "blatantly middle class", even considering housing costs. Housing costs are the same whether you're wealthy or not.

More money means more money; if you have more, you're not "getting by like everyone else". Everyone else also has to pay for all those things, wealthy or not.

Also, I'd argue that most people in that range aren't busting their ass any more than someone living on minimum wage.


I've been watching a product we use at my organization roll out incomplete/trash features fast just to have a product, then fix them after the fact.

We've gotten tons of blowback as other teams use the product and find it next to useless, with tons of bugs, and I'm stuck trying to push it. Not a fun place to be.

Learned a lot about the software market and capitalism though.


For me, AI is like a documentation/Google-fu accelerant. There are so many little things where I know exactly what I want to do but can't remember the syntax or usage.

For example, writing IaC especially for AWS, I have to look up tons of stuff. Asking AI gets me answers and examples extremely fast. If I'm learning the IaC for a new service I'll look over the AWS docs, but if I just need a quick answer/refresher, AI is much faster than going and looking it up.


This is exactly how I think of it as well.

Search is awful when you can't remember the exact term for your language/framework/technology, but highlighting code and asking AI helps out a ton.

Before, I'd search over and over fine-tuning my search until I get what I want. Tools like copilot make that fine-tuning process much shorter.


I find that for AWS IaC specifically, with its high pace of releases and a ton of versions dating back more than a decade, the AI answers are a great springboard but require a bit of care to avoid mixing APIs.


My experience with IaC output is that it's so broken as to be not only unhelpful but actively harmful.


I learned this as a teenager when I went from great at math to terrible because I got stuck with crappy teachers. Then, in 11th grade, I got put in algebra 2 with a great teacher and was tutoring other kids.

Math is completely different from other subjects. You can't catch up by cramming or reading a book over the weekend. You have to consistently learn and use it over the years, and have competent teachers to teach it to you.

Once you get placed in remedial math, where they are just corralling misbehaving teenagers and slapping out worksheets so kids can pass, you are basically screwed unless you can get out of that situation.

