Hacker News | atlantis_'s comments

I’m very attracted to this project because, of all the Linux experiences I’ve had, Debian is the one that reliably works - specifically with RPi hardware, drivers, etc.


Isn't the point of open source to contribute rather than complain about what others haven't done?


It is getting better though - check out libcamera.


What's to check out there wrt kernel drivers?


No questions but one suggestion - it would be great to ease the process of making PWAs for the Linux space. My background is designer -> JS -> node -> mucking around with C/C++, and I think this would be a great way to seed experimentation and creativity around apps - essentially hooking into the curiosity of someone chancing on the project, who can then pick through examples, boilerplates, etc. and get excited at the prospect of experimenting with some core HTML/CSS/JS skills for a side project - and specifically outside the cloying business space of Android/Windows/iOS. The problem with Electron is that you embed an entire browser with each app - perhaps some sort of version manager that links PWAs to GTK webviews?


You can write native applications for GTK in any language you want. Most of the time they should be much less memory hungry than running a browser.


> it would be great to ease the process of making PWAs for the Linux space

So developing for Mobian is no different from developing for Debian (in fact I then don't need to use the modem; I just do all of my development on a Debian machine).

So this isn't a Mobian issue per se?

EDIT: I also checked - you can install PWAs via Epiphany (which is installed by default on Mobian).


The belief in the 2010s was that UX designers were these pastel-coloured care bears who cared deeply about the emotional well-being of people clicking things. And the golden rule was to reduce complexity, while weirdly also hobbling control over what the business logic slurped up.


It’s absurd to think that any and all suggestions mentioned here won’t keep coming up in feedback or testing sessions. Is it more likely that Google have negligent designers, or that they design for business goals?


I have been using neural-enhance (gh:alexj) and Topaz tools to upscale PAL/NTSC artworks for the last three years, and I would not be so charitable in describing these tools. They are hallucinating what the model assumes an upscaled image should look like, not enhancing in any way as the word is understood. The original image ceases to exist. A more honest term might be “Render As Upscaled” or “Generate Higher Resolution Image” (likewise “ML”, not “AI”).

When playing around, funny things happen too: recursive upscale/sharpen passes and analogue artifacts begin resembling topography, molten metal, etc.

Now imagine this being fitted into military drones, which it almost certainly is.
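The recursive upscale/sharpen effect described above can be demonstrated without any ML at all. This is a toy sketch of my own (not the tools mentioned), using a hand-rolled nearest-neighbour upscale and a naive sharpening kernel, to show how repeated passes amplify a tiny blemish until it dominates the image:

```python
# Toy illustration: repeatedly upscaling and sharpening a tiny grayscale
# "image" amplifies whatever artifacts are present until the result no
# longer resembles the input. Pure Python; pixel values are floats 0..255.

def upscale_2x(img):
    """Nearest-neighbour 2x upscale of a 2-D list of pixels."""
    out = []
    for row in img:
        wide = [p for p in row for _ in (0, 1)]
        out.append(wide)
        out.append(list(wide))
    return out

def sharpen(img):
    """Naive sharpening kernel: centre * 5 minus the 4 neighbours, clipped."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            c = img[y][x]
            n = img[y - 1][x] if y > 0 else c
            s = img[y + 1][x] if y < h - 1 else c
            wv = img[y][x - 1] if x > 0 else c
            e = img[y][x + 1] if x < w - 1 else c
            out[y][x] = min(255.0, max(0.0, 5 * c - (n + s + wv + e)))
    return out

# A flat grey patch with one faint "analogue artifact": a slightly brighter pixel.
img = [[128.0] * 4 for _ in range(4)]
img[1][2] = 140.0

for _ in range(3):                      # three rounds of upscale + sharpen
    img = sharpen(upscale_2x(img))

# The single 12-unit blemish has blown out to full white against clipped
# surroundings: the "topography / molten metal" effect, in miniature.
print(len(img), len(img[0]))            # 32 32
print(max(max(r) for r in img))         # 255.0
```

Real tools use learned priors rather than a fixed kernel, but the feedback dynamic (invented detail reinforcing itself on each pass) is the same.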


Am I missing something? The innovation is to ... dim LEDs so that full brightness is... more than 100%...

https://youtu.be/4xgx4k83zzc


HDR specifies a brightness. Apple is playing HDR videos at the proper brightness when the rest of the OS might be turned down.

https://appleinsider.com/articles/20/08/03/what-hdr-hdr10-an...

That plus P3 gamut means a video playing on a display is closer to what a filmmaker intended.

https://en.wikipedia.org/wiki/DCI-P3

What's cool is: 1) 4-year-old Macs have become HDR laptops, and 2) the implementation is subtle: you get full brightness in the HDR video without the rest of the UI blasting to full brightness.
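The arithmetic behind this is simple. This is my own back-of-the-envelope sketch (not Apple's actual pipeline, which macOS exposes as "EDR headroom"): if the panel can hit some peak brightness but the user has dimmed SDR white below that, the ratio between the two is the room left for HDR highlights.

```python
# Simplified model of "full-brightness HDR video on a dimmed UI".
# peak_nits and sdr_white_nits are illustrative figures, not measured values.

def hdr_headroom(peak_nits: float, sdr_white_nits: float) -> float:
    """How many times brighter than SDR white an HDR highlight can go."""
    return peak_nits / sdr_white_nits

# XDR-class panel at 1600 nits, UI dimmed so SDR white sits at 200 nits:
print(hdr_headroom(1600, 200))   # 8.0 -- highlights up to 8x the UI white
# Same panel with the UI at a full 500-nit SDR brightness:
print(hdr_headroom(1600, 500))   # 3.2
```

The counterintuitive consequence is that dimming the UI *increases* the headroom available to HDR content, which is why "full brightness is more than 100%" is less silly than it sounds.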


You can dim the whole OS except the video being played. This can’t be done anywhere else at this point.

That video can have a very bright sky, for instance. You can have a bright sky and a blinding text box or neither in other OSs.

It’s also a very bright display in its own right, with 1600 nits vs the 300-400 of a regular one. And 1,000,000:1 contrast as well.


the onus should be on not surrounding yourself with the latest technology - on supporting old hardware and upcycling computers.

here in my city I can pull out a sticky mud-encrusted motherboard and GPU from the flea markets - scrub it down with a bit of isopropyl alcohol, and put together something as responsive as a 2015 macbook air for 20 euro (PopOS is a favourite for this).

in an agency, you have an MBP thrust upon you, and your software “works fine on [our] computers”. end users still running last-generation CPUs will eventually decide “I need a new computer”, and the cycle is complete.

it’s not that developers don’t care, it’s that they’re never given the chance to care under these ... circumstances.


There are cases of dev teams intentionally limiting themselves - Microsoft Windows Mobile OS developers being given last-gen devices (200 MHz at a time when flagships were 3x faster), Google web devs developing behind a proxy that simulates developing-world internet (low speed, high packet loss and jitter).
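A degraded-network proxy of that kind is easy to model. This is a toy simulation of my own (illustrative only, not Google's tooling; all parameter values are made up): each packet suffers a base latency plus random jitter, and has a fixed chance of being dropped.

```python
# Toy model of a degraded-network link: base latency, uniform jitter,
# and random packet loss. Seeded so runs are reproducible.
import random

def simulate_link(n_packets, base_ms=300.0, jitter_ms=150.0, loss=0.05, seed=42):
    """Return (delivered_count, mean_latency_ms) over n_packets."""
    rng = random.Random(seed)
    latencies = []
    for _ in range(n_packets):
        if rng.random() < loss:
            continue                     # packet lost in transit
        latencies.append(base_ms + rng.uniform(0, jitter_ms))
    return len(latencies), sum(latencies) / len(latencies)

delivered, mean_ms = simulate_link(1000)
print(delivered)        # roughly 950 of 1000 delivered at 5% loss
print(round(mean_ms))   # roughly base + jitter/2, i.e. ~375 ms
```

On Linux you can impose the same conditions on a real interface with tc's netem qdisc, e.g. `tc qdisc add dev eth0 root netem delay 300ms 150ms loss 5%`.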


But do people want this? Do they have choice?

