I've always thought of regressions, even high-order ones, as just a statistical tool. They do appear at the start of ML courses, sure, but as a tool used within ML techniques, or as a good alternative to them, not as ML in their own right.
It looks like that's not the standard view, though.
Neural networks are just function approximators, so why isn't a k-th order polynomial regression (e.g. fitting a Taylor expansion truncated at order k) also considered "ML"? What's the distinction here?
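To make the comparison concrete, here's a minimal sketch (target function, degree, and sample grid are arbitrary choices of mine) showing that a least-squares polynomial fit does exactly what we'd ask of any function approximator:

```python
import numpy as np

# A degree-k polynomial least-squares fit is itself a trained
# function approximator: pick a model family, minimize a loss on data.
x = np.linspace(-1.0, 1.0, 200)
y = np.sin(np.pi * x)          # target function to approximate

k = 7                          # polynomial order (hyperparameter)
coeffs = np.polyfit(x, y, k)   # "training": least-squares fit
y_hat = np.polyval(coeffs, x)  # "inference": evaluate the model

max_err = np.max(np.abs(y - y_hat))
print(f"max approximation error: {max_err:.2e}")
```

On this example the fit is already very accurate at degree 7, which is part of why the "is this ML?" question feels so fuzzy: the workflow (model family, loss, fitting, evaluation) is the same as for a neural network.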