People should stop wanting "characters" from their strings, especially in the sort of high-level software you'd attempt in Java - and Java was in a good position to push that, the way we've successfully done it for similar things: by not providing the misleading API shape in the first place. Reserve char but don't implement it, is what I'm saying - the way goto is reserved.
Compare, for example, decryption, where we learned not to provide decrypt(someBytes) and checkIntegrity(someBytes) even though that's what people often ask for - it's a bad idea. Instead we provide decrypt(wholeBlock), and you can't call it until you've got a whole block we can do integrity checks on; it fails without releasing bogus plaintext if the block was tampered with. An entire class of stupid bugs becomes impossible.
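To make that concrete, here's a minimal sketch of the whole-block shape in today's Java, using the standard javax.crypto AES-GCM cipher (the class name, key setup and sample plaintext are just for illustration). doFinal() either returns the plaintext of an intact block or throws; there is no API for getting "the bytes so far" out of a tampered message.

    import javax.crypto.AEADBadTagException;
    import javax.crypto.Cipher;
    import javax.crypto.KeyGenerator;
    import javax.crypto.SecretKey;
    import javax.crypto.spec.GCMParameterSpec;
    import java.nio.charset.StandardCharsets;
    import java.security.SecureRandom;

    public class WholeBlockDecrypt {
        public static void main(String[] args) throws Exception {
            // Throwaway AES key and a random 12-byte GCM nonce.
            KeyGenerator kg = KeyGenerator.getInstance("AES");
            kg.init(128);
            SecretKey key = kg.generateKey();
            byte[] nonce = new byte[12];
            new SecureRandom().nextBytes(nonce);

            // Encrypt: the returned block carries its authentication tag.
            Cipher enc = Cipher.getInstance("AES/GCM/NoPadding");
            enc.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, nonce));
            byte[] wholeBlock = enc.doFinal("attack at dawn".getBytes(StandardCharsets.UTF_8));

            // Flip one bit of the ciphertext.
            wholeBlock[3] ^= 0x01;

            // Decrypt: doFinal() refuses to release any plaintext and throws instead.
            Cipher dec = Cipher.getInstance("AES/GCM/NoPadding");
            dec.init(Cipher.DECRYPT_MODE, key, new GCMParameterSpec(128, nonce));
            try {
                dec.doFinal(wholeBlock);
            } catch (AEADBadTagException e) {
                System.out.println("integrity check failed, no plaintext released");
            }
        }
    }

That's the same move I'm arguing for with Strings: the API shape makes the broken workflow unrepresentable rather than merely discouraged.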
Java should have provided APIs that work on Strings, and said: if you think you care about the things Strings are made of, either you need a suitable third party API (e.g. text rendering, spelling), or you actually want bytes, because that's how Strings are encoded for transmission over the network or storage on disk. You don't want to treat the string as a series of "characters", because that's not what it is.
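For concreteness, a sketch of that division of labour, assuming UTF-8 on the wire (the strings and variable names are made up for the example): Strings inside the program, explicitly-encoded bytes at the edges, and no pretence that either one is a list of "characters".

    import java.nio.charset.StandardCharsets;

    public class StringsAndBytes {
        public static void main(String[] args) {
            String greeting = "héllo 🌍";

            // For the network or the disk you want bytes, with an explicit encoding.
            byte[] wire = greeting.getBytes(StandardCharsets.UTF_8);

            // At the other end you decode bytes back into a String.
            String roundTripped = new String(wire, StandardCharsets.UTF_8);
            System.out.println(roundTripped.equals(greeting)); // true

            // Notice nothing here counted "characters": the byte count and the
            // char count don't agree with each other or with what a reader sees.
            System.out.println(wire.length);       // 11 UTF-8 bytes
            System.out.println(greeting.length()); // 8 UTF-16 code units
        }
    }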
The idea that a String is just a vector of characters is wrong; that's not what it is at all. A very low-level language like C, C++ or Rust can be excused for exposing something like that, because it's necessary for the low-level machinery, but almost nobody should be programming at that layer.
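A quick demonstration with nothing but the standard library (the class name is mine, the example strings are arbitrary): indexing into a String by char hands you surrogate halves, and even whole code points don't line up with what a reader would call a character.

    public class NotAVectorOfCharacters {
        public static void main(String[] args) {
            // One visible symbol, one Unicode code point, two Java chars
            // (a UTF-16 surrogate pair).
            String face = "🤦";
            System.out.println(face.length());                          // 2
            System.out.println(face.codePointCount(0, face.length()));  // 1
            System.out.println(Character.isSurrogate(face.charAt(0)));  // true: half a pair

            // Even whole code points aren't "characters" in the reader's sense:
            // this "é" is the letter e followed by a combining acute accent.
            String accented = "e\u0301";
            System.out.println(accented.codePointCount(0, accented.length())); // 2
            // Chopping by index silently drops the accent.
            System.out.println(accented.substring(0, 1)); // "e"
        }
    }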
Imagine if Java insisted on acting as though your Java references were numbers and that it could make sense to add them together. Sure, in fact they are pointers, and a pointer is an integral type, so you could mechanically add them - but that's nonsense; you would never write Java code that needs to do it.
K&R C claimed that char isn't just for representing "ASCII" (which wasn't at that time set in stone as the encoding you'd be using) but for representing the characters of whatever system you're programming, ASCII or not. 'A' wasn't defined as 65 but as whatever the code for A happens to be on your computer. Presumably the current ISO C doesn't make the same foolish claim.