> M4 makes the new iPad Pro an outrageously powerful device for artificial intelligence
Yeah, well, I'm an enthusiastic M3 user, and I'm sure the new AI capabilities are nice, but hyperbole like this is just asking for snark like "my RTX4090 would like a word".
Other than that: looking forward to when/how this chipset will be available in Macbooks!
No, when using wording like "outrageously powerful", that's exactly the comparison you elicit.
I'd be fine with "best in class" or even "unbeatable performance per Watt", but I can absolutely guarantee you that an iPad does not outperform any current popular-with-the-ML-crowd GPUs...
This is true, but that is only an advantage when running a model larger than the VRAM. If your models are smaller, you'll get substantially better performance on a 4090. So it all comes down to which models you want to run.
It seems like 13B ran fine on the 4090, but the more fun or intelligent models I tried became very slow and would have performed better on the M3.
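For concreteness, here's a rough back-of-the-envelope sketch (my own illustration, nothing official) of why model size vs. memory budget decides this: weight memory is roughly parameter count × bytes per weight, ignoring KV cache, activations, and runtime overhead. The 24 GB and 36 GB budgets below are the 4090's VRAM and the M3 MacBook figure mentioned downthread; `estimate_weight_gb` is just a made-up helper name.

```python
# Approximate weight memory for an LLM: params * bits_per_weight / 8 bytes.
# Ignores KV cache, activations, and runtime overhead, so real needs are higher.

def estimate_weight_gb(params_billion: float, bits_per_weight: float) -> float:
    """Approximate weight memory in GiB for a model of the given size."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / (1024 ** 3)

budgets = {"RTX 4090 (24 GB VRAM)": 24, "M3 MacBook (36 GB unified)": 36}

for params in (13, 34, 70):
    for bits, label in ((16, "fp16"), (4, "4-bit quant")):
        need = estimate_weight_gb(params, bits)
        fits = ", ".join(f"{name}: {'fits' if need < cap else 'too big'}"
                         for name, cap in budgets.items())
        print(f"{params}B @ {label}: ~{need:.0f} GiB -> {fits}")
```

Running this shows the pattern the commenters describe: a 13B model at 4-bit easily fits in 24 GB, while a 70B model even at 4-bit (~33 GiB of weights) spills out of the 4090's VRAM but can still sit in 36 GB of unified memory.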
Yes, M3 chips are available with 36GB of unified RAM when embedded in a MacBook, although 18GB or less is the norm for most configurations.
And even though the Apple press release does not even mention memory capacity, I can guarantee you that it will be even less than that on an iPad (simply because RAM is very battery-hungry and most consumers won't care).
Hence my remark: it will be interesting to see how this chipset lands in MacBooks.