On Android, macOS/iOS, and Windows, this is a solved problem. Only on the extremely fragmented Linux/POSIX runtimes do these problems surface.
Rust's solution is "it depends". You can use OpenSSL (the system copy or a statically compiled one) or rustls (statically compiled, with your own CA roots, the system CA roots, or the WebPKI CA roots).
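To make that concrete, here's a minimal sketch of the two rustls-side choices. The crate versions are assumptions (rustls 0.23, webpki-roots 0.26, rustls-native-certs 0.7) and the function names are mine; the builder API has moved around between releases:

    // Cargo.toml (assumed): rustls = "0.23", webpki-roots = "0.26",
    // rustls-native-certs = "0.7"

    fn config_with_webpki_roots() -> rustls::ClientConfig {
        // Option A: compiled-in WebPKI (Mozilla) roots; no system store involved.
        let mut roots = rustls::RootCertStore::empty();
        roots.extend(webpki_roots::TLS_SERVER_ROOTS.iter().cloned());
        rustls::ClientConfig::builder()
            .with_root_certificates(roots)
            .with_no_client_auth()
    }

    fn config_with_system_roots() -> rustls::ClientConfig {
        // Option B: whatever CA roots the OS provides. This is where the
        // Linux fragmentation bites: the roots may live in /etc/ssl/certs,
        // /etc/pki/tls/certs, or wherever the distro decided to put them.
        let mut roots = rustls::RootCertStore::empty();
        for cert in rustls_native_certs::load_native_certs().expect("no system root store found") {
            roots.add(cert).expect("system store contains an invalid root");
        }
        rustls::ClientConfig::builder()
            .with_root_certificates(roots)
            .with_no_client_auth()
    }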
I'm afraid that until the *ix operating systems come out with a new POSIX-like definition that stabilises a TLS API, regardless of whether that's the OpenSSL API, the WolfSSL API, or GnuTLS, we'll have to keep hacking around in APIs that need to be compatible with arbitrary TLS configurations. Alternatively, running applications through Waydroid/Wine will work just fine if Linux runtimes can't get their shit together.
> On Android, macOS/iOS, and Windows, this is a solved problem.
Is it, though? It is absolutely trivial for an Android app (like the one you use for banking) to pin a specific CA or even a specific server certificate, and as far as I'm aware it is pretty much impossible to universally override this.
In fact, by default Android apps don't accept any user-installed certs. Android keeps separate stores for system-installed CA roots and user-installed CA roots, and since Android 7.0 the default is to only trust the system-installed store. Apps have to explicitly opt in (via a network security config) to trusting the user-installed store.
Are you sure? It's been a few years, but last I tried, Firefox used its own CA store on Windows. I'm pretty sure OpenJDK uses "<JAVA_HOME>/jre/lib/security/cacerts" instead of the system store too.
Is it solved in macOS? curl recently removed macOS keychain support because there are something like 7 competing APIs, 6 of which are deprecated, and the remaining one is a complete HTTP replacement, so curl can't use it.
The only reason it still works in the curl that ships with macOS is that Apple is a few versions behind.
That last part does sound like a bad deal based on recent anti-owner-control habits like sealed immutable system volumes, but I definitely want to be constrained to a single system cert store controlled by the owner of a computer. Which works for the corporate case as well as the personal one.
I can't say I'm shocked. Disappointed, maybe, but it's hardly surprising to see the sociopathic nature of the people fighting tooth and nail for the validation of venture capitalists who will not be happy until they own every single cent on earth.
There are good people everywhere, but being good and ethical stands in the way of making money, so most of the good people lose out in the end.
AI is the perfect technology for those who see people as complaining cogs in an economic machine. The current AI bubble is the first major advancement where these people go mask off; when they unapologetically started trying to replace basic art and culture with "efficient" machines, people started noticing.
> Windows used to be "just post the exe on your web site, and you're good to go."
That's also one of the main reasons why Windows was such a malware-ridden hellscape. Microsoft went the Apple route to security and it worked out.
At least Microsoft doesn't require you to dismiss the popup, open the system settings, click the "run anyway" button, and enter a password to run an unsigned executable. Just clicking "more details -> run anyway" still exists on the SmartScreen popup, even if they've hidden it well.
Despite Microsoft's best attempts, macOS still beats Windows in terribleness when it comes to running an executable.
I just wish these companies could solve the malware problem in a way that doesn't always involve inserting themselves as gatekeepers over what the user runs or doesn't run on the user's computer. I don't want any kind of ongoing relationship with my OS vendor once I buy their product, let alone have them decide for me what I can and cannot run.
I don't think Safari mattered much. Java was still used for things that wouldn't work on phones without massive redesigns anyway.
I doubt you'd have been able to bootstrap Runescape in any form, even rewritten in native code, on the first iPhone to support apps. Applets worked fine on desktops and tablets, which is what they were designed for.
Browser vendors killed the API because when they looked at crashes, freezes, and performance opportunities, the Flash/Java/etc. API kept standing out. Multithreaded rendering became practical only after the old extension model was refactored, and even then browsers were held back by the terrible plugin implementations they needed to work around.
Apple was the first to publicly call out native plugins (Jobs did so on stage) and outright refused to support them on iOS; then everyone else followed suit.
NPAPI's death in non-IE browsers started around 2015. Jobs announcing mobile Safari without Flash was 2010. Unfortunately, ActiveX still works to this very day.
Chrome built up a whole new API (PPAPI) to support reasonably fast Flash after the Jobs announcement. Microsoft launched a major release of Silverlight long after Jobs' speech, but Silverlight (rightfully) died with Windows Phone, for which it was the main UI platform around the time of its practical death. Had Microsoft managed to launch a decent mobile operating system, we'd probably still be running Silverlight in some fashion today. Even so, Silverlight lasted until 2021 before it actually fell out of support.
Jobs may have had a hand in the death of Flash websites, but when it came to Java Applets/Silverlight, the decision had little impact. That plugin model was slowly dying on its own already.
There was a Flash runtime on Android. It was terrible. Java applets were already dead outside of professional contexts, which are not relevant on phones anyway.
It takes a few seconds longer to load because it loads all of Java Spring, but it still performs just fine on my phone (though the lack of on-screen keyboard activation makes it rather unfortunate for use in modern web apps).
The alternatives to Java were just as bad. Flash and friends were fast, but for most of their life they couldn't do anything more complicated than animation. In Java's heyday, you were doing either Java or custom ActiveX plugins, and both led to security popups galore and random browser freezes.
However, ActiveX usually required you to install components, while Java could just run the first time.
ActiveX was its own special kind of terrible for many reasons, but so were Java, Flash, and Silverlight. At least ActiveX didn't hide the fact that you were about to grant arbitrary code execution to a website; you might as well have assumed that was the case the second any of these plugins loaded.
The only advantage to Java applets I can think of is that they froze the browser so thoroughly it could no longer be hacked.
The Java applet system was designed better than ActiveX, but in practice I've always found it to be a much worse end-user experience. That probably has to do with the fact that most ActiveX components were rather small integrations rather than (badly fitted) full-page UIs.
Flash is still a big loss imho; the ecosystem of games, movies, and demonstration thingies was amazing, and they were accessible for many to create. Unlike Java applets, which slowed the main browser UI thread to a crawl if they didn't load (and they usually didn't), Flash didn't have such slowdowns.
One exception is early-2000s Runescape: that was Java in the browser, but it always loaded; no gray screen, no hanging browser. They knew what they were doing.
Many of the old games and movies still play back well with Ruffle installed (https://ruffle.rs/). Newgrounds embeds it by default for old interactive Flash media that they couldn't convert directly to video.
It's not a perfect fit, but it works. The speed of Ruffle loading on a page is similar to that of Flash initializing, so you can arguably still make Flash websites and animations with the old look and feel if you stick to the Ruffle compatibility range. The half-to-one-second page freeze that used to be the norm now feels wrong, though, so maybe it's not the best idea to put Flash components everywhere like we used to.
Runescape proved that Java could be a pretty decent system, but so many inexperienced/bad Java developers killed the ecosystem. The same is true on the backend, where Java still suffers from the reputation the Java 7 monolithic mega projects left behind.
It's good that we have the runtime to run old Flash games. What we lost is an extremely easy environment for authoring/creating them. Nothing has come even close since Flash. Not just games, but any kind of interaction and animation on the web.
I think perhaps what was lost is mostly this: Macromedia. They had a knack for making content creation simple. Flash was just one of the results of this: It let people create seemingly-performant, potentially-interactive content that ran almost universally on the end-user computers of the time -- and do it with relative ease because the creation tools existed and were approachable.
Macromedia also provided direction, focus, and marketing; more of the things that allowed Flash to reach saturation.
Someone could certainly come up with an open JS stack that accomplishes some of the same things in a browser on a modern pocket supercomputer. And countless people certainly have.
But without forces like marketing to drive cohesion, simplicity, and adoption, none of them can reach similar saturation.
In my experience, most of the more important Java-on-the-web stuff was Java Web Start as opposed to applets. And Java Web Start was all kinds of bad. About the only remotely good thing I could say is that it had a sandbox. Which protected no one from anything, by design, because it was the app’s choice whether to use it. And Web Start apps often included native code too, so they weren’t even portable.
So much JWS PTSD. Industrial automation got a giant dose of it, and those things somehow weren’t even portable across different versions of Internet Explorer.
Explorer freezing halfway through copying happens all the time for me; usually it means Windows' I/O buffer is full and the drive is taking its sweet time actually doing the data transfer. Windows will happily show you gigabytes per second being copied to a USB 2.0 drive if your RAM is empty enough, but it'll hang when it tries to flush.
Sometimes it's interference, sometimes the backing SSD is just a lot slower than it says on the box. I've also seen large file transfers (hundreds of gigabytes) expose bad RAM as caches would get filled and cleared over and over again.
You should be able to go into the Windows settings and reduce the drive cache. Copying will be slower, but behaviour will be more predictable.
The download being cached in RAM kind of makes sense, curl will do the same (up to a point) if the output stream is slower than the download itself. For a scripting language, I think it makes sense. Microsoft deciding to alias wget to Invoke-WebRequest does make for a rather annoying side effect, but perhaps it was to be expected as all of their aliases for GNU tools are poor replacements.
I tried to look into the whole Expand-Archive thing, but as of https://github.com/PowerShell/Microsoft.PowerShell.Archive/c... I can't even find the Expand-Archive cmdlet source code anymore. The archive source files themselves seem to leave "expand" unimplemented. Unless they moved the expand command to another repo for some reason, it looks like the entire command will disappear at some point?
Still, it does look like Expand-Archive was using the plain old System.IO.Compression library for its file I/O, although there is a bit of pre-processing to validate that paths exist and such, which may take a while.
> curl will do the same (up to a point) if the output stream is slower than the download itself
That "up to a point" is crucial. Storing chunks in memory up to some max size as you wait for them to be written to disk makes complete sense. Buffering the entire download in memory before writing to disk at the end doesn't make sense at all.
curl's approach will lead to partial and failed downloads. When a client stops accepting new data, servers tend to close the connection after a while.
There are smoother ways to deal with this (e.g. throttling the download rate to match the output speed by faking dropped packets), but if you just want a simple download command, I think both simple solutions are fine.
If the download doesn't fit in RAM, it'll end up swapped out and effectively cached to disk anyway.
The standard solution to this is to write the download to a temporary hidden file on the same volume and then rename it into place once the download succeeds (or delete it on failure).
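A minimal sketch of that pattern in Rust (the function name is mine, and the ".part" suffix is a simplification; a real implementation would pick a collision-free temporary name):

    use std::fs;
    use std::io::{self, Read};
    use std::path::Path;

    fn download_to(mut src: impl Read, dest: &Path) -> io::Result<()> {
        // Temp file in the destination directory, so the rename below
        // stays on one volume (and is atomic on POSIX filesystems).
        let tmp = dest.with_extension("part");
        let result = (|| -> io::Result<()> {
            let mut file = fs::File::create(&tmp)?;
            io::copy(&mut src, &mut file)?;
            file.sync_all()?; // flush before the rename makes it "official"
            fs::rename(&tmp, dest)
        })();
        if result.is_err() {
            let _ = fs::remove_file(&tmp); // drop the partial file on failure
        }
        result
    }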
That's true when downloading to a file, but Invoke-WebRequest is more curl-like than wget-like. It's designed to return an object/struct rather than simply download a file.
If you want to download many/large files, you're probably better off with Start-BitsTransfer.
I don't think it's included by default but the font itself will just work once you install it.
As for open fonts (can fonts even be truly closed in the first place?), Times New Roman is just as closed and proprietary as Calibri is.