Local EGOP for Continuous Index Learning
Alex Kokot, Anand Hemmady, Vydhourie Thiyageswaran, Marina Meila
We introduce the setting of continuous index learning, in which a function of many variables varies only along a small number of directions at each point. For efficient estimation, it is beneficial for a learning algorithm to adapt, near each point $x$, to the subspace that captures the local variability of the function $f$. We pose this task as kernel adaptation along a manifold with noise, and introduce Local EGOP learning, a recursive algorithm that uses the Expected Gradient Outer Product (EGOP) quadratic form as both a metric and an inverse covariance of our target distribution. We prove that Local EGOP learning adapts to the regularity of the function of interest, showing that under a supervised noisy manifold hypothesis, intrinsic-dimensional learning rates are achieved for arbitrarily high-dimensional noise. Empirically, we compare our algorithm against the feature learning capabilities of deep learning, and demonstrate improved regression quality compared to two-layer neural networks in the continuous single-index setting.
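The abstract above does not spell out how the EGOP is computed; a common route (an assumption here, not necessarily the authors' procedure) is a Monte Carlo average of gradient outer products $\frac{1}{n}\sum_i \nabla f(x_i)\nabla f(x_i)^\top$, with gradients approximated by finite differences. In a single-index example, the top eigenvector of this matrix recovers the index direction:

```python
import numpy as np

def estimate_egop(f, X, eps=1e-4):
    """Monte Carlo estimate of the EGOP: the average over sample points of
    grad f(x) grad f(x)^T, with gradients taken by central finite differences.
    This is an illustrative sketch, not the paper's Local EGOP algorithm."""
    n, d = X.shape
    egop = np.zeros((d, d))
    for x in X:
        # Central finite-difference gradient, one coordinate at a time.
        g = np.array([(f(x + eps * e) - f(x - eps * e)) / (2 * eps)
                      for e in np.eye(d)])
        egop += np.outer(g, g)
    return egop / n

# Single-index example: f varies only along the direction v.
rng = np.random.default_rng(0)
d = 5
v = np.zeros(d)
v[0] = 1.0
f = lambda x: np.sin(x @ v)

X = rng.normal(size=(200, d))
M = estimate_egop(f, X)

# The EGOP is (numerically) rank one; its top eigenvector aligns with v.
w = np.linalg.eigh(M)[1][:, -1]
```

In the continuous index setting described above, this average would instead be taken over a local neighborhood of each point, so the recovered subspace can vary with $x$.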
Jan-21-2026