So is the “binary” nature of today’s switches the core objection? We routinely simulate non-binary, continuous, and probabilistic systems on binary hardware. Neuroscientific models, fluid solvers, analog circuit simulators, etc., all run on the same “binary switches” and produce behavior that cannot meaningfully be described as binary; only the substrate is.
I was shocked to discover how incredibly poorly IndexedDB works. I always thought it would be fast and snappy, if a bit alien. But nope, it's incredibly bad!
Despite being a native browser feature it's incredibly slow, and the way it fetches records by non-primary keys forces you either to load your entire dataset into RAM at once or to iterate through it record by record in a slow callback. Something as trivial as 10k records can bring your web app to a crawl.
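To make that concrete, here's a minimal sketch of the cursor pattern I mean (the store and index names are made up): every matching record costs you another trip through an event callback.

```typescript
// Querying by a non-primary key means opening a cursor on an index and
// walking it one record per onsuccess callback. "records"/"status" are
// hypothetical names for this sketch.
function findByStatus(db: IDBDatabase, status: string): Promise<unknown[]> {
  return new Promise((resolve, reject) => {
    const results: unknown[] = [];
    const tx = db.transaction("records", "readonly");
    const index = tx.objectStore("records").index("status"); // assumes this index was created
    const request = index.openCursor(IDBKeyRange.only(status));
    request.onsuccess = () => {
      const cursor = request.result;
      if (cursor) {
        results.push(cursor.value); // one record per callback invocation
        cursor.continue();          // schedules the *next* callback
      } else {
        resolve(results);           // cursor exhausted
      }
    };
    request.onerror = () => reject(request.error);
  });
}
```

And note that collecting everything into `results` like this is exactly the "load it all into RAM" option; the only alternative is doing your work inside that callback, one record at a time.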
I've built some pretty intensive stuff on IndexedDB, and it's the only thing I've ever done using native browser features that I could get to consistently crash the browsers I tested it on (granted, this was many years ago). On top of that, the API is so ugly. I cannot believe IndexedDB won over WebSQL (when every browser already embeds SQLite). What a shame.
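For comparison, the same kind of lookup under the long-deprecated WebSQL API was a single SQL call. This is from memory (WebSQL only ever shipped in WebKit browsers, and the table name here is made up):

```typescript
// WebSQL (deprecated): one declarative query instead of a cursor loop.
// openDatabase is not in TypeScript's DOM typings anymore, hence the cast.
const db = (window as any).openDatabase("app", "1.0", "demo", 2 * 1024 * 1024);
db.transaction((tx: any) => {
  tx.executeSql(
    "SELECT * FROM records WHERE status = ?",
    ["active"],
    (_tx: any, result: any) => console.log(result.rows), // all matches at once
  );
});
```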
There are several "American" cars that would be interesting for our market, but when people here talk about importing American cars (e.g. Toyotas), it's usually not the kind of car you Americans think of, so it's not much of a worry for us ...
I think companies have always prioritized their own interests.
A company can increase its profits (1) by improving its products and services, so that it gets more customers or customers willing to pay more, or (2) by increasing how much of its revenue is profit, e.g. by cutting corners on quality, raising prices, or selling customers' personal information to third parties.
Either of those can work. Yes, a noble idealistic company might choose #1 over #2 out of virtue, but I think that if most companies picked #1 in the past it's because they thought they'd get richer that way.
I think what's happened is that, for some reason, #2 has become easier or more profitable relative to #1 over time. Or maybe it wasn't so clearly understood in the past that #2 was a live option, and #1 seemed safer; now everyone knows you can get away with #2, so they do.
We can have stable, user-friendly software. We had a nice sweet spot in the early 2000s with Windows XP and Mac OS X: stable operating systems built on workstation-quality kernels (NT and Mach/BSD, respectively), and a userland that respected the user by providing distraction-free experiences and not trying to upsell them. Users of workstations already experienced this in the 1990s (NeXT, Sun, SGI, HP, and PCs running IBM OS/2 or Windows NT), but it wasn’t until the 2000s that workstation-grade operating systems became readily available to home users, with both Windows XP and Mac OS X 10.0 being released in 2001.
We do of course still have this in modern computing with Linux/KDE. Stable, snappy, and does exactly what you ask. The computer doesn't get in your way, nor does it try to get you to do something else. It just does what you tell it to do, immediately.
Yup, desktop Linux and other FOSS systems like ReactOS and Haiku are the last bastions of personal computing that haven’t been made into platforms that nag and upsell us.
> first version of Linux - which I did not have the honor to use but I can imagine how user friendly it was
My first accounts were on Linux 1.x. It was glorious: simple, sensible, and with manuals one command away. And it let you just get things done. And there were tools. So many tools. '80s home computers, DOS crap, and Macs that couldn't even open a file if it hadn't been tagged as the property of some application... hells I would never have to be a part of anymore. Except for work and school. But for personal computing, a brighter future was coming. In the 30+ years since, I've never had to step away.
It's not impossible that an AI was asked to sprinkle in a few typos for effect, but perhaps it really is just written by a person who really loves emojis.
What an absurd take. If we use FLOPS as a crude measure, the Air would be comparable to the leading supercomputers of ~1999/2000. There are many reasons why that's a very poor comparison, but ignoring the sheer insanity of the raw compute available in a pocketable, thin, battery-powered handheld you can buy literally this week is ridiculous. Modern smartphones are nothing short of sci-fi compared to even recent living memory. We're simply used to them because of their sheer ubiquity.
The A19 GPU doesn't even have hardware support for FP64, which is the precision used for the TOP500 LINPACK benchmark. No, it is not comparable to the leading supercomputers of 1999/2000. For calibration: ASCI Red, the TOP500 #1 in 1999, delivered roughly 2.4 TFLOPS of FP64 LINPACK, and ASCI White, #1 at the end of 2000, roughly 4.9 TFLOPS, while phone GPU figures are quoted at FP32 or FP16.
It's easy to say 'write tests', but it's difficult to do so if you don't know what the actual requirements are.
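A toy illustration (hypothetical example, not from any real codebase): without a settled requirement, a test doesn't verify anything so much as freeze one interpretation of an ambiguous rule.

```typescript
// Requirement as stated: "round prices to the nearest cent."
// Half-up? Half-even (banker's rounding)? The spec doesn't say,
// so the test just picks an interpretation and locks it in.
import { strict as assert } from "node:assert";

function roundPrice(cents: number): number {
  return Math.round(cents); // JS Math.round rounds .5 toward +Infinity (half-up)
}

assert.equal(roundPrice(10.5), 11); // passes under half-up...
// ...but under banker's rounding the "correct" answer would be 10.
```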
People often think they are doing it 'properly' now by starting simple, but as they learn more and add functionality, they end up with the same complex mess they wanted to avoid.
I agree that it is a mostly well-managed product, but I can think of plenty of times when it was in the news for something bad.
Most controversial is probably the increase in the number of ads, including unskippable ones. Then there were multiple problems with YouTube Kids, e.g. bad actors getting really disturbing videos in front of children. There was an outcry when the public dislike count was removed, and so on...