My Paperwhite is about 7-8 years old and is still holding up fine, though the battery is noticeably degraded: I'm charging it approximately once a week now.
I was also having a play with a demo model of the latest one in a store, and the page-turn speed is much, much better, which is tempting me to upgrade, though I'd prefer to run the current one into the ground first.
Aside from the data consistency issues mentioned, you can also quickly run into connection pool exhaustion, where concurrent requests that have each already obtained a transaction accidentally ask for another, then all stall holding the first one open until timeouts occur.
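A minimal sketch of that failure mode, with the pool modeled as a semaphore rather than a real database driver (the names and pool size are illustrative, not from any particular library):

```python
import threading

# Hypothetical sketch: a connection pool of size 1, modeled as a semaphore.
# A request that already holds a connection and accidentally asks for a
# second one stalls until its timeout fires.
POOL_SIZE = 1
pool = threading.BoundedSemaphore(POOL_SIZE)

def get_connection(timeout):
    # Returns True if a "connection" was obtained before the timeout.
    return pool.acquire(timeout=timeout)

def handle_request():
    assert get_connection(timeout=1.0)  # first acquire: fine
    try:
        # Bug: opening a second transaction while still holding the first.
        # With the pool exhausted, this blocks until the timeout.
        got_second = get_connection(timeout=0.2)
        return got_second  # False means the request stalled and timed out
    finally:
        pool.release()

print(handle_request())  # False: the nested acquire timed out
```

With many concurrent requests doing the same thing against a slightly larger pool, they all hold one connection while waiting on a second that can never be freed, which is the stall-until-timeout behaviour described above.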
I don't disagree, but I think there is a distinction between "everything is e2ee, but specific conversations may be MitM'd without detection" and "nothing is e2ee and can be retrospectively inspected at will" that goes a little beyond security theatre. It makes it more analogous to old-fashioned wiretaps in my mind.
Obviously it involves trusting that it isn't really "we say it's e2ee, but we also MitM every conversation".
Even with closed-source clients, MitMing every conversation would likely be detected before long: various people take memory dumps of clients and the like, and someone would flag it up soon enough.
I like to follow conventional commit style, and some repos I work on have CI checks for it. It's been fixed now, but for a long time the validator we were using would reject commits that included long URLs in the body (for exceeding the line-width limit).
It was enraging: I'm trying to provide references to explain the motivation for my changes, all my prose is nicely formatted, but the bulleted list of references I've included gets my commit rejected.
I generally think it's in the category of a social problem, not a technical problem: communicate the expectations, but don't dogmatically enforce them.
Personally I'm using HAProxy for this purpose, with Lego to generate wildcard SSL certs using DNS validation on a public domain, then running CoreDNS, configured as one of the tailnet's DNS resolvers, to serve A records for internal names on a subdomain of the public one.
I've found this to work quite well, and the SSL, whilst somewhat meaningless from a security point of view since the traffic is already encrypted by WireGuard, keeps the web browser happy, so it's still worthwhile.
> I used to throw every scrap of code onto GitHub in the vague hope of “sharing knowledge”
I looked at a random repo today, and used some of its (MIT licensed) code as a starting point.
It was an Expo plugin for managing Android keystores. I didn't need most of what it did, and I went in a different direction with the remaining bits, but it still helped me get there quickly. That won't show up in any stats the author can see, but I appreciate their contribution.
We've only raised a handful of support cases with GCP over the past 5 years, but we happened to raise one this week, and they've put us onto a preview feature that solves the problem we were facing. I'm suddenly wondering if we should be trying our luck with support more often instead of figuring things out ourselves.
Heh, that's my PR. Initially I thought it would be a trivial change, but then I realized I hadn't considered how it should interact with MDM / device posture functionality - these aren't features I'm personally using with the Android client, but are understandably important to enterprises.
I still hope to get back to that and try to get it to a state where it can be merged, but I need to figure out how to test the MDM parts of it properly, and ideally get a bit of guidance from the Tailscale team on how it should work and whether my implementation is on the right track (I think I had some open questions around the UI as well).
I think the interface breaking on newer screens is a key point. AoE2 Definitive Edition looks great on a 4K screen now, but when I tried one of the other variants beforehand, the UI didn't scale properly, so all the elements were tiny to the point of being unplayable without adjusting the resolution.
I had a similar idea as a teenager: calculate an MD5 hash and store that plus a hint/offset, then brute-force the original content. I had dial-up and wanted a more practical way to get large files.
Anyway, I emailed the WinRAR developers about my idea, and they politely explained why they didn't think it was feasible (I appreciate that they even took the time to respond!).