Because none of the existing software would work. The idea of running a Rosetta-like feature on an 8-bit CPU isn't feasible. The Apple II eventually received an upgraded processor, the 65816, which was compatible with the 6502.
Though a problem, as you point out, it still happened. The 6800-based SWTPC was followed by 6809 machines, which needed to have all their software reassembled.
On the other side of the CPU wars, all those 8080 machines moving on to Z80s got to keep all their binary software, and the same thing happened again for IBM PCs and clones as they evolved.
The Java runtime isn't any more inherently insecure than the JavaScript runtime, and JavaScript seems to work just fine for the web.
The key reason applet security failed was that it gave you the entire JDK by default, so every method in the JDK needed explicit security-checking code in place to restrict access. The model was backwards -- full control by default with selective disabling meant that every new feature in the JDK was a new vulnerability.
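To make the backwards model concrete, here's a minimal sketch of the old pattern (the `readFile` method and `LegacyCheck` class are hypothetical illustrations; `SecurityManager.checkRead` is the real, now-deprecated JDK API). Every sensitive operation had to remember to consult the manager, and any missing check was an allowed operation:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class LegacyCheck {
    // Deny-by-default never existed: unless the author remembered to
    // insert a check like this, the operation was fully permitted.
    @SuppressWarnings("removal") // SecurityManager is deprecated for removal
    static byte[] readFile(String path) throws IOException {
        SecurityManager sm = System.getSecurityManager();
        if (sm != null) {
            sm.checkRead(path); // throws SecurityException if a policy denies it
        }
        return Files.readAllBytes(Path.of(path));
    }
}
```

Note that with no manager installed (the default), the check is skipped entirely, which is exactly the "full control by default" problem.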
Just look up "Java applet sandbox escape". There were tons of ways to do it. Here are some [0]. Then there's the coarse-grained permissions that were essentially useless to begin with.
Yes, I'm familiar with these. Many of the earliest problems were due to bugs in the verifier, and several different vendors each had their own set of bugs. The bulk of these problems were identified and resolved over 25 years ago.
Most of the later problems are due to the fact that the API attack surface was too large, because of the backwards SecurityManager design. And because it existed, it seems there was little incentive to do something better.
Once the instrumentation API was introduced (Java 5), it made it easier to write agents which could limit access to APIs using an "allow" approach rather than the awful rules imposed by the SecurityManager. Java 9 introduced modules, further hardening the boundaries between trusted and untrusted code. It was at this point the SecurityManager should have been officially deprecated, instead of waiting four more years.
Going back to the earlier comment, the problem isn't due to the runtime being somehow inherently insecure, but instead due to the defective design of the SecurityManager. It hasn't been necessary for providing security for many years.
Sometimes I'd like to have unsigned types too, but supporting it would actually make things more complicated overall. The main problem is the interaction between signed and unsigned types. If you call a method which returns an unsigned int, how do you safely pass it to a method which accepts a signed int? Or vice versa?
Having more type conversion headaches is a worse problem than having to use `& 0xff` masks when doing less-common, low-level operations.
This adds an extra level of friction that doesn't happen when the set of primitive types is small and simple. When everyone agrees what an int is, it can be freely passed around without having to perform special conversions and deal with errors.
When trying to adapt a long to an int, the usual pattern is to overload the necessary methods to work with longs. Following the same pattern for uint/int conversions, the safe option is to work with longs, since it eliminates the possibility of having any conversion errors.
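A minimal Java sketch of both patterns mentioned above -- the `& 0xff` mask for reading a byte as unsigned, and widening to `long` so the full unsigned-int range fits without conversion errors (`UnsignedDemo` is just an illustrative name):

```java
public class UnsignedDemo {
    public static void main(String[] args) {
        byte b = (byte) 0xF0;             // -16 as a signed byte
        int unsignedByte = b & 0xFF;      // mask reinterprets it as 240
        System.out.println(unsignedByte); // 240

        int i = -1;                               // bit pattern 0xFFFFFFFF
        long widened = Integer.toUnsignedLong(i); // 4294967295
        System.out.println(widened);
        // A long can represent every unsigned 32-bit value, so `widened`
        // can be passed around freely with no further conversion checks.
    }
}
```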
Now if we're talking about signed and unsigned 64-bit values, there's no 128-bit type to upgrade to. Personally, I've never had this issue, considering that 63 bits of integer precision is massive. Unsigned longs don't seem that critical.
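For what it's worth, Java 8 did add static helpers that reinterpret a signed long's bits as unsigned for the few operations where the sign actually matters -- a quick sketch:

```java
public class UnsignedLongDemo {
    public static void main(String[] args) {
        long x = -1L; // bit pattern of the maximum unsigned 64-bit value

        // Printing, division, and comparison are the operations where
        // signedness matters; addition/subtraction/multiplication work
        // identically on the raw bits either way.
        System.out.println(Long.toUnsignedString(x));        // 18446744073709551615
        System.out.println(Long.divideUnsigned(x, 10));      // 1844674407370955161
        System.out.println(Long.compareUnsigned(x, 1L) > 0); // true
    }
}
```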
I think the only answer would be you can’t interact directly with signed stuff. “new uint(42)” or “ulong.valueOf(795364)” or “myUValue.tryToInt()” or something.
Of course if you’re gonna have that much friction it becomes questionable how useful the whole thing is.
It’s just my personal pain point. Like I said I haven’t had to do it much but when I have it’s about the most frustrating thing I’ve ever done in Java.
The first official JIT became available in JDK 1.1, in 1997. The Symantec JIT was available as an add-on sometime in mid 1996, just a few months after JDK 1.0 was released. Even better performance was possible with GCJ, available in 1998.
The release of HotSpot was in 1999, and it became the default with JDK 1.3 in 2000. It took JIT compilation to the next level, making tools like GCJ mostly obsolete.
Do Java problems go away? I thought the selling point was that your huge un-rewritable enterprise software will crash tomorrow like it crashed yesterday.
Here's a talk from Netflix (hopefully sufficient enterprise for the discussion) that goes over how a JDK version upgrade to generational ZGC improved a bunch of their request timeouts: https://youtu.be/XpunFFS-n8I?si=XG6zYYZy50sfNE4j
At the time the feature was added, there was no way to make a parameter to a function be lazily evaluated. Something like `assert(condition, "error: " + stuff)` would eagerly concatenate the string even when the condition is always true (which it should be). Nowadays, the error parameter can be specified as a lambda, which can potentially be optimized to be just as cheap as the existing assert feature.
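The lambda approach can be sketched like this (the `check` helper and `LazyAssert` class are hypothetical illustrations, not JDK API) -- the `Supplier` defers the string concatenation until the condition actually fails:

```java
import java.util.function.Supplier;

public class LazyAssert {
    // Hypothetical helper: unlike assert(condition, "error: " + stuff),
    // the message expression inside the lambda is never evaluated
    // unless the condition is false.
    static void check(boolean condition, Supplier<String> message) {
        if (!condition) {
            throw new AssertionError(message.get());
        }
    }

    public static void main(String[] args) {
        int stuff = 42;
        check(stuff > 0, () -> "error: " + stuff); // lambda body never runs here
    }
}
```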
Directly or indirectly many (or most) projects ended up depending on something which was using an unsupported backdoor API because it provided a marginally useful capability. The module system restricted access to these APIs and everything stopped working, unless you added some magic command line arguments to gain access again.
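The "magic command line arguments" are typically `--add-opens` and `--add-exports` flags on the `java` launcher -- a sketch (the `java.base` packages shown are common real-world cases, e.g. for libraries doing deep reflection; `app.jar` is a placeholder):

```shell
# Re-grant access that the module system closed off, for code on the
# classpath (the unnamed module).
java --add-opens java.base/java.lang=ALL-UNNAMED \
     --add-exports java.base/sun.nio.ch=ALL-UNNAMED \
     -jar app.jar
```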
So for most people, the initial impression of modules was negative, and they just decided to rule the feature out completely. This has created a sea of useless criticism in which any constructive criticism is hardly noticed. Improvements to module configuration (combining it with the classpath) would go a long way toward making modules "just work" without the naysayers getting in the way.
>Directly or indirectly many (or most) projects ended up depending on something which was using an unsupported backdoor API because it provided a marginally useful capability. The module system restricted access to these APIs and everything stopped working, unless you added some magic command line arguments to gain access again.
Is it even theoretically possible for a project like this to not run into these kind of issues? Like literally the project's goal is to enable library authors to be more explicit about their public API. So breaking use cases that use unsupported backdoor APIs very much seems like a predictable and expected result?
Early on there were things you couldn't do without using com.sun.* classes, so people got used to doing that. It's been many years since that was all fixed, though.
If you're truly happy with the language you're using, it's a hard pitch to make.
That said, Nod is my attempt to make a full-featured, professional-strength language that is easier to learn and easier to use in complex and demanding applications like servers and low-level infrastructure.
It has very regular and consistent syntax, making it easy (easier) to learn, read, comprehend, and write. It has simple and complete object-oriented features supporting multiple inheritance and polymorphism. It has simple and complete generic programming features. It has simple and complete exception handling. It has built-in modularity for extensibility and easier third-party integration. It has powerful features to enable simple and automatic external data serialization. It has a rich fundamental type set that supports collections, advanced mathematics, concurrency, real-time synchronization, and unicode. It ultimately compiles to machine code, making it performant.
Bottom line, Nod compares across the board with "modern" C++ and then some.
The "devil's in the details" though, and the details can be accessed via Nod's very basic website.
Most programmers in time will be burned by some aspect of their language, or they'll wish it would do something better. So, many will invest the time in learning a new language if they could see that it works better and/or pays better (especially if it pays better). But, in the end, I'm the first to admit that a new language is a hard pitch to make, even if it's "perfect."