Most machine learning is not done on a band-limited signal. I've literally seen these tactics applied to demand forecasting, and I just can't convince myself that they should be.
Ah, I see what you mean. Yes, if you're not dealing with approximately bandlimited and sampled signals, then this wouldn't apply. The article is about embedded devices processing sensor data (microphones, motion/light sensors, accelerometers, etc.), and in those cases the signal of interest will often be bandlimited.
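To make concrete why band-limitedness is the property that matters, here's a toy sketch of my own (not from the article): if the signal really is band-limited and you sample above the Nyquist rate, Whittaker-Shannon sinc interpolation recovers it between the samples, which is what licenses the downstream DSP machinery.

```python
import numpy as np

# Toy example: a 15 Hz tone is band-limited, so sampling at 100 Hz (> 2 * 15 Hz)
# loses nothing -- sinc interpolation rebuilds the signal between the samples.
f0, fs = 15.0, 100.0
T = 1.0 / fs
t_samp = np.arange(0.0, 1.0, T)            # sample instants over one second
x_samp = np.sin(2 * np.pi * f0 * t_samp)   # the sampled values

# Reconstruct on a fine grid from the samples alone:
#   x(t) = sum_n x[n] * sinc((t - n*T) / T)   (np.sinc is sin(pi z)/(pi z))
t_fine = np.linspace(0.0, 1.0, 2000)
x_rec = np.array([np.sum(x_samp * np.sinc((t - t_samp) / T)) for t in t_fine])

# Away from the edges (the sum is truncated there, so the boundaries are
# imperfect) the reconstruction matches the true tone closely.
x_true = np.sin(2 * np.pi * f0 * t_fine)
interior = (t_fine > 0.1) & (t_fine < 0.9)
print("max interior error:", np.abs(x_rec - x_true)[interior].max())
```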
Well, natural signals that start and stop aren't band-limited either; a time-limited signal can't have finite bandwidth. Band-limitedness is a mathematical abstraction that approximates many real-world scenarios well enough.
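A quick way to see it: take a pulse that is 1 on $[-T/2, T/2]$ and 0 elsewhere. Its Fourier transform is

$$X(f) = \int_{-T/2}^{T/2} e^{-i 2\pi f t}\, dt = \frac{\sin(\pi f T)}{\pi f},$$

and those sinc tails decay only like $1/f$ and never vanish beyond any finite cutoff. More generally, no nonzero signal can be both time-limited and band-limited (Paley-Wiener), which is why band-limitedness is always an approximation for real recordings.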
But our perception of many things can effectively be band-limited, so we can work with the data without losing anything that matters. I'm unconvinced this is the case in the places ML is often used.
Note, I have to hedge: I'm not convinced they're inapplicable, just not convinced they're applicable.
Also note, I hadn't gotten the article to load when I fired off my concern. I stand by the concern, but acknowledge it doesn't apply to this article.
As I said in another comment, I find it hard to separate what is signal processing vs. what is information theory vs. what is ML. I've heard arguments like "if it's got trigonometry then it's signal processing" or "if it's one-dimensional then it's signal processing," and I find them pretty weak and unconvincing.
Officially I belong to the ML tribe, but all of these fields are tackling pretty much the exact same problem, and any breakthrough in one of them will translate to the others. The name of the topic has changed over the years, but the fundamental problem has remained the same -- let's call it yet another name -- approximating/extracting an unknown function from samples.
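To put a face on it, here's a toy version of that shared problem (my own sketch, no claim it's any field's canonical formulation):

```python
import numpy as np

# The shared problem: recover an unknown function f from noisy samples
# (x_i, y_i = f(x_i) + noise). Everything below is an illustrative toy.
rng = np.random.default_rng(0)
f = lambda x: np.sin(2 * x) + 0.5 * x          # the "unknown" target
x = rng.uniform(-2.0, 2.0, size=40)            # where we sampled it
y = f(x) + 0.1 * rng.standard_normal(x.size)   # what we observed

# Fit one hypothesis class (a degree-7 polynomial) by least squares. Call it
# regression (ML), system identification (DSP), or estimating a source seen
# through a noisy channel (information theory) -- the computation is the same.
degree = 7
Phi = np.vander(x, degree + 1)                 # feature/design matrix
coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)

x_test = np.linspace(-2.0, 2.0, 5)
print(np.c_[x_test, f(x_test), np.polyval(coef, x_test)])  # x, truth, estimate
```

Swap the polynomial basis for sinusoids and it reads as spectral estimation; swap it for a neural net and it reads as supervised learning.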
I agree they're all related. I'm just pushing back on the applicability of some techniques in places where we don't actually know there's a signal, if that makes sense.