On the Sample Complexity of Learning under Invariance and Geometric Stability
Neural Information Processing Systems
Learning from high-dimensional data is known to be statistically intractable without strong assumptions on the problem. A canonical example is learning Lipschitz functions, which in general requires a number of samples exponential in the dimension due to the curse of dimensionality (e.g., [
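The exponential dependence on dimension can be made quantitative. As a hedged reminder of the classical rate (not stated in this excerpt; assuming noisy observations on the unit cube and squared-error loss), the minimax risk for estimating a 1-Lipschitz function satisfies

```latex
\inf_{\hat f}\ \sup_{f \in \mathrm{Lip}_1([0,1]^d)}
  \mathbb{E}\,\bigl\| \hat f - f \bigr\|_{L^2}^2
  \;\asymp\; n^{-\frac{2}{2+d}},
```

so that driving the squared error below $\varepsilon^2$ requires on the order of $n \gtrsim \varepsilon^{-(2+d)}$ samples, i.e., a sample size exponential in the dimension $d$.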