Maybe, but then again, those systems are based on a systems language whose design flaws the industry has spent 50 years mostly ignoring, until governments decided it was about time to start taking liability for cybersecurity seriously.
Also, I didn't come up with this myself; it comes from their marketing materials and white papers.
Finally, if it were worthless, they would probably have dropped it by now.
> Also, I didn't come up with this myself; it comes from their marketing materials and white papers.
> Finally, if it were worthless, they would probably have dropped it by now.
Companies love to "talk up" their products in marketing materials. It is far from uncommon for those materials to contain claims which, while not entirely false, aren't exactly true either – and I suspect that's what's happening here.
IBM does the same thing – listen to an IBM i person tell you how "advanced" their operating system is compared to everything else. Sure, there's some theoretical truth to that, but in practical terms it was more true in the past than it is in the present.
The CPU emulator part got me thinking, because the Burroughs B5000 was never bare metal anyway; rather, it was one of the first bytecode-based OSes, so it isn't really emulation when running on the Libra systems.
That would be like saying Java/Android and .NET applications, or IBM i, are running under an emulator – even though, technically, a dynamic compiler is a form of emulation.
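To make that distinction concrete, here is a minimal sketch in C of what interpretation-style emulation looks like. The toy opcodes and dispatch loop are made up purely for illustration – they're not anything the Burroughs or .NET runtimes actually use. A dynamic compiler does the same job, but translates runs of bytecode into native code before executing them, which is why it can be called a form of emulation.

    /* Toy stack-bytecode interpreter: the simplest form of emulation.
       The opcodes here are hypothetical, purely for illustration. */
    #include <stdio.h>

    enum { OP_PUSH, OP_ADD, OP_PRINT, OP_HALT };

    static void run(const int *code) {
        int stack[64], sp = 0;
        for (int pc = 0; ; pc++) {
            switch (code[pc]) {
            case OP_PUSH:  stack[sp++] = code[++pc]; break;  /* push literal */
            case OP_ADD:   sp--; stack[sp-1] += stack[sp]; break;
            case OP_PRINT: printf("%d\n", stack[--sp]); break;
            case OP_HALT:  return;
            }
        }
    }

    int main(void) {
        /* A "program" in the toy bytecode: compute 2 + 3, print the result. */
        int program[] = { OP_PUSH, 2, OP_PUSH, 3, OP_ADD, OP_PRINT, OP_HALT };
        run(program);  /* prints 5 */
        return 0;
    }

A JIT would replace the switch loop with a translation step that emits native instructions for the same opcode sequence, so "interpreter", "dynamic compiler", and "emulator" are points on one spectrum rather than distinct categories.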
I think there is a big difference: on the original hardware, the “emulator” was in the CPU microcode, so very close to bare metal, and was narrowly targeted to do only what it needed to do.
Compare that to a software emulator running under a commodity general-purpose operating system: it is a lot further from the bare metal. Once you consider all the layers in between (the OS kernel, libc, etc), the trusted computing base is a lot larger, and being general-purpose means it includes lots of features the emulator doesn’t need or use. So from a security viewpoint this is in some respects a step backwards, even though it was made necessary by economics. At the same time, it has some practical security benefits: although the general-purpose OS may be theoretically worse from a security perspective, it receives huge amounts of attention, which helps keep it secure. A rarely used proprietary platform, whatever its theoretical advantages, doesn’t receive the same attention, making it more likely that vulnerabilities lurk undiscovered.