
Install git without requiring root privileges... Also, git is a 3rd-party application, yet somehow all distros treat git as part of the base system/OS bundle/whatever, so Fedora 25 is stuck at version 2.9, 24 at 2.7, etc. WHY. WHY. It's third-party software, not part of your god damn OS - there should be only one version of git in the repos for ALL FEDORA VERSIONS - and that would be the latest... Come the fuck on.

Replace "git" with every other package....

Linux is such amateur hour, with all these plainly stupid decisions, that I'd rather pay good money for a proper Unix.


If you want the latest and greatest of everything, you could use Arch Linux.

Git today could in many ways be seen as a fundamental component of many tools. Many packaging and build tools use it to fetch data (like Homebrew, the plugin systems of many text editors, etc.).

Also, the git version of your distribution IS relevant because other packages depend on it. For example, on my Ubuntu system the git package is a dependency of over 170 other packages. If you could install a newer version, a lot of these other packages might break.
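
A quick way to see that fan-out for yourself (a rough sketch, assuming a Debian/Ubuntu box with apt-cache available; the exact count will differ from machine to machine):

    # list everything that declares a dependency on git, then count the entries
    apt-cache rdepends git | tail -n +3 | wc -l
    # same, but limited to packages actually installed on this machine
    apt-cache rdepends --installed git | tail -n +3 | wc -l

The tail -n +3 just skips the two header lines apt-cache prints before the package list.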

The recommended way of installing git on macOS is via Apple's git variant by installing Xcode, which also requires root privileges, btw.


If you grab git from Homebrew, it doesn't overwrite the Xcode one, so if you depend on that you're good. How hard could this possibly be for apt-get and the like?
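
For what it's worth, they coexist cleanly because Homebrew installs its own copy under its prefix instead of replacing Apple's (a sketch, assuming the classic /usr/local prefix; which copy wins depends on your PATH order):

    brew install git
    which -a git
    #   /usr/local/bin/git   <- Homebrew's build, typically first on PATH
    #   /usr/bin/git         <- Apple's copy from the Xcode command line tools, untouched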


to be fair, there is LinuxBrew[1]; also, apparently you can use nix to the same effect.

I do believe that Linux-based desktop OSs should separate the base system from user software, kind of like the *BSDs have been doing; actually, I'd really like distros to embrace something like Homebrew, where packages are installed per-user.

[1] http://linuxbrew.sh/


> If you want the latest and greatest of everything, you could use Arch Linux.

You're kidding me, right? https://git.archlinux.org/svntogit/community.git/log/trunk?h...

There are sometimes weeks before node is updated.

Updating PostgreSQL from 9.6.1 to 9.6.2 took 4 months.

And right now the .NET Core build is failing on my Arch Linux test server.

---

If you want latest - use macOS with brew


If you need the latest versions so badly, why don't you just compile from source instead of waiting 4 months?
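
For the record, a per-user source build of git is also one answer to the "no root" complaint upthread (a rough sketch, assuming the usual build dependencies like the libcurl and zlib headers are already present; 2.9.0 is just the version mentioned above):

    # unpack a release tarball, build, and install into your home directory - no root needed
    tar -xzf git-2.9.0.tar.gz && cd git-2.9.0
    make prefix=$HOME/.local all
    make prefix=$HOME/.local install
    # then make sure ~/.local/bin comes first on your PATH
    export PATH="$HOME/.local/bin:$PATH"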


thanks package manager, you're useless


Your criticisms of Linux are basically that there is a proper security/stability process in place.


0.2% of desktop users agree. Continue living in your strange bubble, just stop mentioning Linux to normal folk.


I didn't realise normal folk came to HN, but your advice is noted.


You know who wrote git, right?


[flagged]


[flagged]


[flagged]


Please stop posting like this here.

https://news.ycombinator.com/newsguidelines.html


Because you can upgrade from Windows 3.11 all the way to 10; good luck with Ubuntu.

Yeah, I see your point.


I have actual machines that I was able to flawlessly upgrade from 4.10 to 17.04. Can't say that about Windows. Though I did actually know what I shouldn't mess with if I wanted the upgrade to go through (or what to fix beforehand).


You got extremely lucky. Not that Windows is much better, but in general there are hiccups when upgrading most Linux distros, as well as Windows.


I hit problems with the desktop environment on both 8->10 and 10->12, then gave up trying to upgrade Ubuntu in-place. Debian has been absolutely fine, though.


> Because you can upgrade from windows 3.11 all the way to 10

That has never ever worked for me until the 7 -> 10 upgrade: 95 -> 98, 98 -> 2k, 2k -> XP and XP -> 7 all failed miserably and required installing from scratch.

And even 7 -> 10 failed without any helpful information the first few times I tried it (unlike OSX, where I've been upgrading and migrating the same system for more than 10 years across multiple machines and versions).


This is more like 7 -> 10 (14.04 is the LTS predecessor to 16.04). Totally valid upgrade path.


There are people who have done it all of the way from Windows 1.


No, she was blinded by her own hygiene... I've been using contact lenses for 7 years now.


You're either trolling, or you didn't think this through. People can do all kinds of potentially dangerous things for an extended period of time without negative consequences (heroin, football, diet pills, etc). Extrapolating anything based solely on your own survival is weird.

Moreover, the infection being discussed in this article is incredibly rare - and contact lenses are generally quite safe - so the fact that neither you nor anyone else commenting here has been exposed to it shouldn't be surprising. That you've made it 7 whole years without going blind isn't some vindication of your own hygienic patterns, or an indictment of hers.


> 25 yrs

Is it really worth butchering a word to save one letter?


Not really news; saying "a Linux distro chases rewrite ideas instead of sticking to one thing and fixing it, and that's why the Linux desktop won't ever happen" is like saying "grass is green".


Because we solve practical problems and don't nitpick over what is technically correct. We are not drones; we easily understand that in this context it's megabytes.

Just look at all that "technically it's mebibytes bla bla bla" in replies. No one cares. Write some code. Or better - go outside.


space launches have crashed because of confusion over standard units.

in that case it was confusion between metric and certain fantasy engineering units, but an error of 1000/1024 will cause trouble just as badly.

so with that attitude maybe don't write that code, and better stay inside or a rocket might fall on your head.

but for serious, that correction probably taught more than 10 people the difference between uppercase B = bytes, lowercase b = bits, uppercase M = mega = 1 000 000, lowercase m = milli, MiB = mebibyte = 1024 x 1024 bytes = 1 048 576 bytes, or at least made them aware of the important fact that there is a difference. your complaining about the nitpicking, meanwhile, has taught nobody anything except to always stay alert, because there are people like you who like to offload mental ballast and use the wrong units, insisting their errors can be inferred and corrected from context... which is an important lesson too, but as a warning, not as a defence of the behaviour.
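
for the curious, the gap between the binary and decimal prefixes also grows with every step up the scale (plain bash arithmetic, shown purely as an illustration):

    echo $(( 1024**2 ))   # 1 MiB = 1048576 bytes, ~4.9% more than 1 MB
    echo $(( 1024**3 ))   # 1 GiB = 1073741824 bytes, ~7.4% more than 1 GB
    echo $(( 1024**4 ))   # 1 TiB = 1099511627776 bytes, ~10% more than 1 TB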


This. Thank you.


The original comment sounds more like a Google shill comment, to be honest. iPhones take pictures instantly too (although they give quick UI feedback), so either it's hard to believe that the attendant never touched an iPhone, which is, like, well, the most popular phone ever; or the Pixel really doesn't indicate in any way that it has taken the picture.


> X11 runs really well these days -- responsive, fast, reliable

Well, you haven't seen Windows then - the graphics stack is phenomenal and a marvel of engineering. nVidia drivers crash? I only get a second of black screen and then resume my work. Yep, that's right - no other GUI program crashed, I didn't have to do anything, literally just 1 second of black screen.

Oh, and you can have one window span two monitors and both parts of the window will have full vsync - insane, huh? :)

It's scary how good Windows is.


I don't know much about graphics stacks, but in my experience, at least from the perspective of someone using multiple HiDPI displays, macOS is far and away the best. Many Windows applications don't scale properly, or if they do, they require special settings to do so. If you have monitors with different levels of scaling you're going to have a terrible time on Windows.


On the other hand, connecting a 4K display in Windows will default to configuring it at 200% scaling; on a Mac it defaults to rendering everything tiny as ants.


I think it's only 150%; currently running Windows on a 28" 4K monitor makes me think 200% would be too much.

Oh I should also add that I haven't ever had any of the DPI issues the parent's parent is referencing. The only problem with multiple display DPI in Windows 10 is the shockingly bad fuzz you get on your secondary display from the thing being rendered either smaller or larger than normal (depending on whether the 4k is your primary or secondary) and then scaled up or down to fit the monitor.


It may somehow be detecting the size of the display. My 13" laptop likes 250% scaling, while my 27" monitor likes 150%.


That would make sense, your DPI scaling is all to do with readability after all :)


Windows 10 refuses to work properly with my 4K Dell monitor.

After it goes to sleep and I wake it up, all open windows have been resized into a tiny part of the screen and the scaling goes weird.

https://duckduckgo.com/?q=windows+4k+monitor+resize+after+sl...

Terrible.

Having said that on macOS my external 5K LG monitor is causing complete system crashes now and again :(


I have a 43" 4k screen. It's near impossible to make Windows NOT use scaling, even if you disable it everywhere, the next Windows updates usually reset your carefully created registry hacks.

I wish Windows based its scaling on DPI instead of resolution; the system seems to be aware of both. On a more general level, I wish there were any hope of passing feedback to Microsoft/Apple/etc.


>... in mac it defaults to render everything tiny as ants.

I'm a fan of no DPI scaling (100%) at 4K, at least on my 27" monitor. It takes 2-3 months of getting used to, but once your brain and eyes adapt, significantly lower dot pitches become completely unusable. The only thing I change is bumping up my terminal or editor's default font size a tad.

That said, I'm not sure how people with 24" 4K monitors do it without DPI scaling. I'd probably even prefer 30" myself.


>...significantly lower dot pitches become completely unusable.

Correction/clarification: significantly higher* dot pitches, as in lower pixel density. "Completely unusable" was meant in the sense of how it'd feel to return to 800x600 after being accustomed to 1080p. 4K is four times 1080p, so it's roughly comparable.

It wasn't my intention to offend anyone with poor eyesight, or suggest that people ruin theirs. Just that it's possible to get used to really low (dense) dot pitches, and once you do it's simultaneously really enjoyable and weird at the same time.


I once used 15" MBPr without DPI scaling (2880x1440 native res) for a while. Then I became too worried about my eyes.


As someone who remembers NT4 video driver crashes bringing down the whole system, when I saw Windows 10 recover from a video driver (yes, nVidia) crash I was massively impressed.

There's always room to improve but I'm really happy with the stability improvements they've put in.


FWIW my graphics driver crashes on Ubuntu always recover. Even a complete GPU hang. It sees the driver is hung and restarts it, causing a few seconds of minor glitching.

Configuring X is not for the faint of heart, but in Unity it is basically magical and deals with HiDPI displays, etc. just fine. Feature for feature it is very similar to Win or Mac on the display and GPU driver front. The total package still feels rougher around the edges, but it is still good.


My only issue with configuring X is that 'Load "glx"' will not fetch the correct libglx.so with the nvidia drivers loaded unless you tell it to go looking in /usr/lib/nvidia-[VERSION]/ :| It took me a very, very long time to get GLX working on my setup.
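
In case it helps anyone else, the usual fix is to add the driver's directory to the server's module search path in xorg.conf (just a sketch, assuming the proprietary driver ships its libglx.so under /usr/lib/nvidia-[VERSION]/ as above; adjust the paths for your distro and driver version):

    Section "Files"
        # search the nvidia directory first so its libglx.so is picked over the stock one
        ModulePath "/usr/lib/nvidia-[VERSION]"
        ModulePath "/usr/lib/xorg/modules"
    EndSection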


Yes, but only if you have the Aero compositor turned on... which adds at least one frame of latency to everything. Without Aero, Chrome can't play video without tearing like crazy. Oh, and that white pixel in the top left corner of the screen if you've got Aero turned on but turned off the annoying as crap transparent windows. Yeah, scary good...


Yes, and it only took 30 years, too!


Windows' stack is still a ways from being reliable in a lot of somewhat important cases like (real) fullscreen, but it's certainly miles beyond X11 and Linux.

(I don't have enough experience with Macs to speak to it.)


I think that's a case of necessity being the mother of invention. IME the windows 10 drivers crash much more frequently than the linux or windows 7 ones ever did.


I don't think it's fair to blame the OS for driver crashes. Those are a result of third parties and can happen on any platform.


That varies by OS/driver/hardware, but IMO the new AMD graphics drivers being in the kernel tree are the way to go, or something similar, like a partnership between MS/AMD/Nvidia.

As a consumer it's incredibly frustrating to have a buggy driver and not know who is responsible. Is it MS? Windows comes with a lot of drivers so blaming MS seems fair. Is it the hardware manufacturer? Sometimes you can get the latest drivers but the OEM hardware isn't quite standard so you're screwed. Is the OEM to blame? Usually, because they have their own driver update system, but then the question is why can't they use the native windows update system?

The current situation on windows seems to be that no one is responsible.


> then the question is why can't they use the native windows update system?

Cynical answer, and I will grant that GPU vendors are less bad than e.g. printer or smartphone vendors, but Windows Update just distributes drivers, so it doesn't give vendors all the opportunities to upsell, advertise to, or lock in users that bundling their own crap with the driver in their own installer allows.


Many vendors do use the Windows Update system. Looking at my old Windows 10 box, in the last three months it's received display driver updates from Intel and nVidia (it's an Optimus system so yay, twice the driver update joy). It's also got a mouse driver update from something called ELAN.

Also, Windows has this thing called minidrivers where Microsoft essentially writes a chunk of your driver for you (the generic chunk), and you only have to write the bits specific to your device. The idea is that Microsoft could QA their drivers better than J. Random OEM ever could, and so this'd reduce cost for OEMs and also make the Windows platform more stable.

https://msdn.microsoft.com/en-us/windows/hardware/drivers/ge...


You're making this harder than it needs to be. All Windows drivers are signed. Blame the party that signed the buggy driver.


The point is, users will blame Windows because they don't know what a graphics driver is or that they have one, so MS took steps to avoid people thinking Windows is buggy. MS has a long history of this, going as far as reproducing old bugs so existing software keeps working.


You sign an executable to attest that it is authentic, not that it is bug-free.


> Not performing testing on Linux is inexcusable in my book in this day and age.

No, wasting time on two open-minded users instead of working for the 99.999% of users is inexcusable.


Give him another 17 years and he might learn some basics.


I must write such an article on my blog. Some clickable title like "37 things I have learned during the last 37 years"...

The list would be quite short:

- use Postgres
- don't irritate your readers (e.g. with autoplay videos, sounds, making accounts to read anything)
- make the page lite (with terribly small usage of javascript)

(then repeat the above 12 times) and the final thing would be:

- just read some books about programming, learn KISS, YAGNI, DI, OFA (other funny abbreviations) - and use it


- set aside time after every iOS release to retest the frontend, because Apple will have broken some part of the standard in amusing and hilarious ways.

