LargoLasskhyfv's comments | Hacker News

Depends heavily on the hardware AND firmware of the system. I remember having some no-name laptop with a P166-MMX and a BIOS from SystemSoft. That thing managed to successfully suspend and resume anything, be it just to and from RAM, or to disk in a separate small partition. By anything I mean exotics like NetBSD, FreeBSD, OpenBSD, and any Linux I threw at it.

It isn't on Lenovo ThinkCentres like the M910q Tiny, which has integrated Intel HD Graphics 630. Works every single time.

CachyOS just works for me. Highly optimized Arch, working flawlessly and without hassle.

I know my way around Arch, and in the roughly two years of using CachyOS I never needed to intervene, with the exception of things like changed configs or split packages. But those are announced in advance on the respective webpages, be it Arch itself or CachyOS, and also show up in good old pacman in the terminal, or whichever frontend you fancy. It's THE DREAM!

What's lacking is maybe pre-packaged LLM/machine-learning stuff. Maybe I'm stupid, but they don't even have current llama.cpp, WTF? But at least Ollama is there. LM-Studio also has to be compiled by yourself, either via the AUR or otherwise. But these are my only complaints.


  > Maybe I'm stupid, but they don't even have current llama.cpp, WTF?
I don't understand. It's in the AUR...

https://aur.archlinux.org/packages/llama.cpp

  > has to be compiled by yourself, either via the AUR
I don't think I'd call the AUR "compiled by yourself". It's still a package manager; you're not running the configure and make commands yourself. I mean, what do you want? A precompiled binary? That doesn't work very well for something like llama.cpp: you'd have to ship a lot more with it and pin the versions of its dependencies, which would definitely cost performance.

Is running `yay -S llama.cpp` really that big of a deal? You're not intervening any differently than for any other package that isn't a precompiled binary.
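For reference, a rough sketch of what the AUR route boils down to, assuming yay is already installed (the manual clone/makepkg path is just what any AUR helper automates for you):

  # with an AUR helper
  yay -S llama.cpp

  # or by hand, which is all the helper really does
  git clone https://aur.archlinux.org/llama.cpp.git
  cd llama.cpp
  makepkg -si   # build the package, then install it via pacman

Either way the compile runs on your machine against your installed dependencies, which is exactly what you want for something as hardware-sensitive as llama.cpp.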


Bad is relative. I usually don't bother with the MPV/yt-dlp combo (except for rare local downloading for backup/convenience purposes) and just let it play in the browser. I'm not logged in, because I don't have a Google account anymore. It's smooth, and plays instantly when opened in a new tab. I let it keep its cookies and don't erase them, so I mostly get the content I like. For things I'm unsure how they'd affect the algorithm, or if they're AI-slopped music, I just open them in a private window. Works for me with just uBO and some additional list subscriptions in there.
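For the rare local-download case mentioned above, the combo is roughly this (a sketch; mpv picks up yt-dlp automatically when it's on the PATH, and the URL/output template here are just placeholders):

  # stream directly; mpv hands the URL to yt-dlp behind the scenes
  mpv 'https://www.youtube.com/watch?v=VIDEO_ID'

  # or keep a local copy for backup/convenience
  yt-dlp -o '%(title)s.%(ext)s' 'https://www.youtube.com/watch?v=VIDEO_ID'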

How loud will it be?

I'd like to believe, but I won't.

Clickbait.


Mulder finally retired.

Nope, not really ;-)

It's just that when this came up shortly after April 1st this year, there was nothing to it: only circular references, backed by some slick but sloppy AI graphics, and nothing else. It was an assumption/speculation uttered by someone, then endlessly repeated and overamplified.

I don't see any reason why it should be different this time.

Beggars for clicks.



Bandsalat (German for "tape salad", i.e. a tangled-up tape)

Great name for a band

Every time I read about DRAM/SRAM scaling I'm reminded of https://www.besang.com/

Ever heard of them? What do you think? Vaporware?


Yah. Maybe. OTOH most places are easily walkable and bikeable, and if you don't want to do that, there is usable, affordable, working public transportation everywhere. No 'food deserts', because there's no need to drive for (dozens of) miles for groceries. If it's your thing, there is weed, though not as good and varied as in the '90s. Oh, and Armin van Buuren!
