I’m very attracted to this project because, of all the Linux experiences I’ve had, Debian is the one that reliably works - specifically with the RPi, hardware, drivers, etc.
No questions, but one suggestion - it would be great to ease the process of making PWAs for the Linux space. My background is designer -> JS -> node -> mucking around with C/C++, and I think this would be a great way to seed experimentation and creativity around apps - essentially hooking the curiosity of someone chancing on the project, who can then pick through examples, boilerplates, etc. and get excited at the prospect of using some core HTML/CSS/JS skills for a side project - specifically outside the cloying business space of Android/Windows/iOS. The problem with Electron is that you embed an entire browser with each app - perhaps some sort of version manager that links PWAs + GTK webviews?
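To make that last idea concrete, here's a minimal sketch of such a webview shell in Python via PyGObject - a sketch, assuming GTK 3 and WebKit2GTK are installed; the title and URL are placeholders:

    import gi
    gi.require_version("Gtk", "3.0")
    gi.require_version("WebKit2", "4.0")
    from gi.repository import Gtk, WebKit2

    # One window wrapping the system webview, instead of bundling
    # an entire browser per app the way Electron does.
    win = Gtk.Window(title="My PWA")           # placeholder title
    win.set_default_size(360, 720)

    view = WebKit2.WebView()
    view.load_uri("https://app.example.org/")  # placeholder PWA URL

    win.add(view)
    win.connect("destroy", Gtk.main_quit)
    win.show_all()
    Gtk.main()

Every shell like this shares the one system WebKit, which is roughly what the version-manager idea would formalise.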
> it would be great to ease the process of making PWAs for the Linux space
So developing for Mobian is no different from developing for Debian (in fact, I then don't need to use the modem; I just do all of my development on a Debian machine).
So this isn't a Mobian issue per se?
EDIT: I also checked - you can install PWAs via Epiphany (which is installed by default on Mobian).
The belief in the 2010s was that UX designers were these pastel-coloured care bears who cared deeply about the emotional well-being of people clicking things. And the golden rule was to reduce complexity, while weirdly also hobbling the user's control over what the business logic slurped up.
It’s absurd to think that the suggestions mentioned here haven’t been coming up continually in feedback and testing sessions. Is it more likely that Google have negligent designers, or that they design for business goals?
I have been using neural-enhance (gh:alexj) and Topaz tools to upscale PAL/NTSC artworks for the last three years and would not be so generous in describing these tools. They hallucinate what the model assumes an upscaled image should look like; they do not enhance in any sense as the word is understood. The original image ceases to exist. A more honest term might be “Render As Upscaled” or “Generate Higher-Resolution Image” (likewise “ML”, not “AI”).
When playing around, funny things happen too: with recursive upscale/sharpen passes, analogue artifacts begin resembling topography, molten metal, etc.
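You can get a crude feel for that feedback loop without any ML at all - a toy sketch with Pillow (not the tools above; the filenames are placeholders), where each pass re-amplifies whatever the previous pass invented:

    from PIL import Image, ImageFilter

    img = Image.open("frame.png")  # placeholder: a PAL/NTSC capture
    for _ in range(8):
        # crude stand-in for an upscale/sharpen pass: sharpen the
        # artifacts the previous iteration produced, then repeat
        img = img.resize((img.width * 2, img.height * 2), Image.LANCZOS)
        img = img.filter(ImageFilter.SHARPEN)
        img = img.resize((img.width // 2, img.height // 2), Image.LANCZOS)
    img.save("recursive_passes.png")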
Now imagine this being fitted into military drones, which it almost certainly is.
What's cool is: 1) 4-year-old Macs have become HDR laptops, and 2) the implementation is subtle - you get full brightness in the HDR video without having the rest of the UI blast to full brightness.
the onus should be on not surrounding yourself with the latest technology - on supporting old hardware and upcycling computers.
here in my city I can pull a sticky, mud-encrusted motherboard and GPU out of the flea markets, scrub them down with a bit of isopropyl alcohol, and put together something as responsive as a 2015 MacBook Air for 20 euro (Pop!_OS is a favourite for this).
in an agency, you have an MBP thrust upon you, and your software “works fine on [our] computers”. end users still running last-generation CPUs will eventually decide “I need a new computer”, and the cycle is complete.
it’s not that developers don’t care, it’s that they’re never given the chance to care under these ... circumstances.
There are cases of dev teams intentionally limiting themselves - Microsoft's Windows Mobile OS developers being given last-gen devices (200 MHz at a time when flagships were 3x faster), and Google's web devs developing behind a proxy that simulates developing-country internet (low speed, high packet loss and jitter).
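If you want to impose the second constraint on yourself, here's a minimal sketch of that kind of degradation using Linux's tc/netem - a sketch, assuming a Linux box with iproute2; the interface name and numbers are placeholder assumptions, and it needs root:

    import subprocess

    IFACE = "eth0"  # placeholder interface name

    def degrade_network():
        # netem: ~300ms latency with 50ms jitter and 3% packet loss,
        # roughly the "developing-country internet" profile above
        subprocess.run(
            ["tc", "qdisc", "add", "dev", IFACE, "root", "netem",
             "delay", "300ms", "50ms", "loss", "3%"],
            check=True,
        )

    def restore_network():
        subprocess.run(
            ["tc", "qdisc", "del", "dev", IFACE, "root"],
            check=True,
        )

Leave it on while developing and every slow request becomes impossible to ignore; restore_network() removes the qdisc when you're done.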