The Wikipedia page says this will replace UH-60s, but I just do not see how that airframe is directly comparable to what’s been a workhorse for decades. Maybe it means only in a long-range reconnaissance role? But even then, that mission is primarily owned by UAS platforms now. Confusing.
I imagine the UH-60 and its variants will continue to serve (who knows, maybe with new airframes) alongside the MV-75 for quite a while, in a similar way to how UH-1s continued to be in use well after UH-60s were deployed in large numbers. This Congressional Research Service summary of the FLRAA/MV-75 program states that the Army plans to continue ordering UH-60s (on the order of 255 between 2027 and 2031) - https://www.congress.gov/crs-product/IF12771
The key requirement that drives the MV-75's downsides (size, complexity, cost) is that the Army wants to be able to play in the Pacific. The UH-60 is deeply limited there.
For example, the MV-75's range should let it fly (one-way) from Guam to the Philippines, or straight from Okinawa to Taiwan with no need to island-hop - potentially even as a two-way mission. The same goes for the Philippines to Taiwan.
The "comparability" is that the MV-75 and UH-60 can each deliver ~14 troops into a clearing of roughly similar size.
Sure, it's going to take decades to actually make the transition, and the UH-60 will remain in service for decades more after that in less demanding roles. I expect that by the time this finishes, the MV-75 will be considered another workhorse (if maybe a slightly fuzzier one) and the UH-60 will be an antiquated platform.
But ultimately they both solve the same problem: moving stuff from A to B in rough terrain, fast. With the ever-increasing number of reconnaissance assets, though, A needs to be further behind the frontline, and so range and speed need to increase beyond what you can manage with a pure helicopter.
Rust as it exists today is very much "PL theory" driven. It's not necessarily a good language, but it's been consistently ranked as the #1 "most loved" by Stack Overflow for the past few years.
Is webgpu a good standard at this point? I am learning vulkan atm and 1.3 is significantly different to the previous APIs, and apparently webgpu is closer in behavior to 1.0. I am by no means an authority on the topic, I just see a lack of interest in targeting webgpu from people in game engines and scientific computing.
For a text editor it's definitely good enough if not extreme overkill.
Other than that, the one big downside of WebGPU is the rigid binding model via baked BindGroup objects. This is both inflexible and slow when any sort of 'dynamism' is needed, because you end up creating and destroying BindGroup objects in the hot path.
The modern Vulkan binding model is relatively fine. Your entire program has a single descriptor set containing an array of images that you reference by index. Buffers are never bound and instead referenced by device address.
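To make the contrast concrete, here's a toy model in plain Python (no real GPU API; the class names and shapes are illustrative assumptions, not wgpu or Vulkan calls). It shows why baked bind groups churn allocations in the hot path while a bindless table does not:

```python
# Toy model (plain Python, no real GPU API) contrasting the two binding styles.

# WebGPU-style: resources are baked into immutable BindGroup objects, so
# changing which textures a draw uses forces a fresh object per draw.
class BindGroup:
    def __init__(self, textures):
        self.textures = tuple(textures)  # immutable once created

def draw_webgpu_style(frames):
    created = 0
    for frame_textures in frames:
        bind_group = BindGroup(frame_textures)  # allocated in the hot path
        created += 1
    return created

# Bindless-style: every texture is registered once into one global array;
# draws just carry integer indices, so nothing is (re)created per draw.
class BindlessTable:
    def __init__(self):
        self.textures = []

    def register(self, texture):
        self.textures.append(texture)
        return len(self.textures) - 1  # shader-visible index

def draw_bindless_style(table, frames_as_indices):
    created = 0
    for indices in frames_as_indices:
        _ = [table.textures[i] for i in indices]  # lookup only, no allocation
    return created

table = BindlessTable()
idx_a = table.register("albedo")
idx_b = table.register("normal")
print(draw_webgpu_style([["albedo", "normal"]] * 100))    # 100 objects created
print(draw_bindless_style(table, [[idx_a, idx_b]] * 100)) # 0 objects created
```

In a real renderer the "table" is a single descriptor set with a large image array and the "index" is just a u32 pushed alongside the draw, which is why the bindless path stays allocation-free even under heavy dynamism.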
Apparently "joy to use" is one of the new core goals of Khronos for Vulkan. Whether they succeed remains to be seen, but at least they acknowledge now that a developer hostile API is a serious problem for adoption.
The big advantage of Metal is that you can pick your abstraction level. At the highest level it's convenient like D3D11, at the lowest level it's explicit like D3D12 or Vulkan.
Bevy engine uses wgpu and supports both native and WebGPU browser targets through it.
The WebGPU API gets you rendering your first triangle quicker, without thinking about vendor-specific APIs and the histories of their extensions. It's designed to be fully checkable in browsers, so if you mess up you generally get errors caught before they crash your GPU drivers :)
The downside is that it's the lowest common denominator, so it always lags behind what you can do directly in DX or VK. It was late to get subgroups, and now it's late to get bindless resources. When you target desktops, wgpu can cheat and expose more features that haven't landed in browsers yet, but of course that takes you back to the vendor API fragmentation.
It's a good standard if you want a sort of lowest-common-denominator that is still about a decade newer than GLES 3 / WebGL 2.
The scientific folks don't have all that much reason to upgrade from OpenGL (it still works, after all), and the games folks are often targeting even newer DX/Vulkan/Metal features that aren't supported by WebGPU yet (for example, hardware-accelerated raytracing).
Having no CSD at all is unacceptable on small screens IMHO; far too much real estate is taken up by a title bar. You can be competitive with SSD by making title bars really thin, but then they are harder to click on and impossible with touch input. At the moment I have Firefox set up with CSD and vertical tabs, and only 7% of my vertical real estate is taken up by bars (inc. GNOME's), which is pretty good for something that supports this many niceties.
I use a lot of obscure libraries for scientific computing and engineering. If I install one from pacman or manage to get an AUR build working, my life is pretty good. If I have to use a Python library, the faff becomes unbearable: make a venv, delete the venv, change Python version, use conda, use uv, try to install it globally, change the Python path, source .venv/bin/activate. This is less true for other languages with local package management, but none of them are as frictionless as C (or Zig, which I use mostly). The other issue is that .venvs, node_modules and equivalents take up huge amounts of disk and make it a pain to move folders around, and no, I will not be using a git repo for every throwaway test.
uv has mostly solved the Python issue. IME its dependency resolution is fast and just works. Packages are hard-linked from a global cache, which also greatly reduces storage requirements when you work with multiple projects.
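The disk savings come from ordinary filesystem hard links. Here's a small demonstration of the mechanism (this is not uv's code, just the POSIX trick its cache relies on; it assumes a filesystem that supports hard links, so Linux/macOS):

```python
# Demonstrate the hard-link trick a package cache can rely on:
# two directory entries pointing at the same inode cost the disk only once.
import os
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    # Stand-in for a file in a global package cache.
    cache_copy = os.path.join(tmp, "cache", "somepackage.py")
    os.makedirs(os.path.dirname(cache_copy))
    with open(cache_copy, "w") as f:
        f.write("x = 1\n" * 1000)

    # Stand-in for the same file appearing inside a project's environment.
    venv_copy = os.path.join(tmp, "venv_site_packages.py")
    os.link(cache_copy, venv_copy)  # hard link, not a copy

    a, b = os.stat(cache_copy), os.stat(venv_copy)
    print(a.st_ino == b.st_ino)  # True: same inode, shared storage
    print(a.st_nlink)           # 2: two names for one file
```

Ten projects depending on the same wheel then cost roughly one copy on disk instead of ten, which is exactly the behavior that makes per-project environments tolerable.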
uv is great for resolution, but it seems like it doesn't really address the build complexity of heavy native dependencies. If you are doing any serious work with torch or local LLMs, you still run into cases where wheels aren't available for your specific CUDA/arch combination. That is usually where I lose time, not waiting for the resolver.
It sounds like your understanding of modern package management is at least ten years out of date, and Python has (until recently) been among the worst, yes, so it definitely wouldn't have been a model to follow.
- AI "collaboration"
- pure maths in a cosmology paper
- Zenodo
- small number of citations from a wide range of dates
- cosmology
One of my favourite youtube videos is Angela Collier's one on cranks; she makes the point that a motivated independent researcher can do science if they choose less ambitious problems, but these people always choose the deepest and most fundamental problems in maths and physics.
Ouch, really? That's basically just work that's not in the current hot topics. Not really a datapoint in favour of 'crank', rather a point against 'active academic/student'. I think it's admirable to look for value in older work.
One suggestion: there is the main paper, as well as supplemental supporting papers on Zenodo. Just download them. Read them. Or... if you don't have time, feed all of them to a reasoning AI and ask for analysis. Ask if it breaks GR. Ask if it is coherent.
Hint: it is. And it is falsifiable - not with stuff that maybe exists, either, but with data that exists now or will in the very near future.
>If you called Netanyahu a monkey because of his Gaza genocide, most people who are pro-palestine will try to cancel you! Not because they think highly of him, but because it hurts the cause more than it helps.
Your reading of the current political climate is very different to mine.
I don't know about that. In my view, you can call him a murderer, genocidal, a sociopath, anything related to his actions. But calling him an epithet, comparing him to an animal, is a different thing. Even physical violence is more tolerable. Of course people can say whatever they want in private; I'm talking about public discourse. Terms like "monkey" and "dog" have been used across cultures to mean really nasty things. It's dehumanizing (literally!), and it says as much about the speaker as it does about the subject.
When humans say "an animal" in English, they're referring to non-human animals. And before you go there: being called an animal isn't insulting in itself either; hardly anyone would be insulted at being called a lion. I think everyone who can read understands exactly the implication being drawn and the dehumanizing being done. Everyone from slave traders to colonialists to Nazis has used "monkey" to dehumanize people. Same with "dog", "snake", etc. in different contexts.