I think to be useful it needs a mode of playing that is always musical / in tune.
Not yet sure how to really do it, but one concept I like from NI plugins is having multiple keyboard zones: one zone is for notes, others are for things like patterns or styles. Imagine a guitar where one zone selects the chord type and tone, and another the strumming pattern...
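Roughly what I have in mind, as a minimal sketch (the MIDI note ranges, the velocity-to-chord-quality rule, and the handler are all made up here, just to illustrate the zone idea):

    # Hypothetical split of the MIDI keyboard into functional zones,
    # loosely in the spirit of NI-style keyboard zones. Boundaries and
    # the velocity rule are illustrative assumptions, not a real mapping.
    CHORD_ZONE = range(36, 60)    # C2..B3: selects chord root / quality
    PATTERN_ZONE = range(60, 84)  # C4..B5: selects the strumming pattern

    def handle_note_on(note: int, velocity: int, state: dict) -> None:
        """Route an incoming note to the zone it belongs to."""
        if note in CHORD_ZONE:
            state["chord_root"] = note % 12  # pitch class picks the chord root
            state["chord_quality"] = "minor" if velocity < 64 else "major"
        elif note in PATTERN_ZONE:
            state["pattern"] = note - PATTERN_ZONE.start  # pattern index
        # notes outside both zones are ignored in this sketch

    state = {}
    handle_note_on(40, 100, state)  # chord zone: E major
    handle_note_on(62, 90, state)   # pattern zone: pattern #2
    print(state)                    # {'chord_root': 4, 'chord_quality': 'major', 'pattern': 2}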
The challenge here is probably the resonance algo for multiple systems based on multiple notes... Maybe the piano concept would be handy here: instead of the piano's three strings per key, imagine the instrument as one resonant system per key, with the systems exciting each other via air or direct resonance points... The systems should be automatically tuned against one reference system (e.g. by automatically scaling string length or tension).
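To make the tuning idea concrete, here is a toy sketch (not the actual GPU algorithm, and all constants are invented): a bank of damped oscillators, one per key, each tuned by scaling a single reference frequency with equal temperament, plus a weak shared coupling term standing in for air/bridge resonance:

    import math

    # Toy model: one damped oscillator ("string") per key, all tuned from a
    # single reference frequency by equal-tempered scaling, with a weak
    # all-to-all coupling standing in for air / bridge resonance.
    SR = 48_000          # sample rate in Hz
    REF_FREQ = 220.0     # reference system (A3); every other key is scaled from it
    DAMPING = 3.0        # velocity damping coefficient (energy loss)
    COUPLING = 0.002     # how strongly the systems excite each other

    class KeyResonator:
        def __init__(self, semitones_from_ref: int):
            # "Automatic tuning": scale the reference frequency instead of
            # hand-tuning each system (analogous to scaling string length/tension).
            freq = REF_FREQ * 2.0 ** (semitones_from_ref / 12.0)
            self.omega = 2.0 * math.pi * freq
            self.pos = 0.0
            self.vel = 0.0

        def step(self, dt: float, drive: float) -> float:
            # Semi-implicit Euler for x'' = -omega^2 * x - DAMPING * x' + drive
            acc = -self.omega ** 2 * self.pos - DAMPING * self.vel + drive
            self.vel += acc * dt
            self.pos += self.vel * dt
            return self.pos

    keys = [KeyResonator(n) for n in (0, 4, 7)]  # a small A-major "chord"
    keys[0].vel = 1.0                            # strike only the reference key

    dt = 1.0 / SR
    out = []
    for _ in range(SR // 100):                   # render 10 ms
        bridge = sum(k.pos for k in keys)        # shared resonance point
        out.append(sum(k.step(dt, COUPLING * (bridge - k.pos)) for k in keys))
    print(f"rendered {len(out)} samples, peak {max(abs(s) for s in out):.2e}")

Scaling everything from one reference keeps the bank in tune by construction; detuning then becomes a deliberate parameter rather than an accident.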
Anyway, amazing work, and having it on GPU really allows this to scale.
An M3 Ultra is two M3 Max chips connected via fabric, so physics.
Did not mean to shit on anyone's parade, but it's a trap for novices, with the caveat that you reportedly can't buy a GB10 until "May 2025" and the expectation that it will be severely supply constrained. For some (overfunded startups running on AI monkey code? YouTube influencers?), that timing is an unacceptable risk, so I do expect these things to fly off the shelves and then hit eBay this summer.
I think it depends a lot on how traffic generated by AI research tools counts towards these stats.
Also, if there is no answer on the web yet, the AI may not know it either. Those questions should still end up on SO.
I might add that SO could also build their own chat / research UI. It would need to offer some benefit over the others, but I guess the community aspect alone would suffice...
I think the title of the paper is misleading. Obviously the result shows impressive performance with just a few training examples. However, I cannot see that, for a fixed method, reducing training data leads to better performance. They have simply shifted the performance curve (impressively) towards fewer examples. Even with this new method, more training data should still give better results. It would be interesting to see the full performance curve of the method as a function of training data amount (and potentially quality).
just like Bell Labs UNIX gave way to BSD UNIX which gave way to SunOS/Solaris which gave way to Linux... this is a tale as old as time. The question is will this end the same way?
That is a horrible way to be treated by a doctor. In Germany you have free choice of doctor, although sometimes you have to wait quite a while to get an appointment...
I am not sure, but with a European Health Insurance Card you should also be able to go to doctors in other countries and be at least partially covered by your insurance.
one of the annoying things about Mathematica is that all functions are crammed into the same namespace and that there is no overloading with different parameterization options...
What do you mean by overloading? Functions can easily have different behavior with different argument counts (e.g., 2- vs. 3-argument Fold), and they can have any number of options (e.g., Graphics, Graphics3D, Solve, Import/Export, etc.). The only big redundancy I can think of is the various Plot functions.