Chebyshev Moment Regularization (CMR): Condition-Number Control with Moment Shaping

Baek, Jinwoo

arXiv.org Artificial Intelligence 

We introduce \textbf{Chebyshev Moment Regularization (CMR)}, a simple, architecture-agnostic loss that directly optimizes layer spectra. CMR jointly controls the spectral edges via a log-condition proxy and shapes the spectral interior via Chebyshev moments, using a decoupled, capped mixing rule that preserves task gradients. We prove strictly monotone descent for the condition proxy, bounded moment gradients, and orthogonal invariance. In an adversarial ``$\kappa$-stress'' setting (MNIST, 15-layer MLP), \emph{compared to vanilla training}, CMR reduces mean layer condition numbers by a factor of $\sim\!10^3$ (from $\approx 3.9\times10^3$ to $\approx 3.4$ in 5 epochs), increases average gradient magnitude, and restores test accuracy ($\approx\!10\% \to \approx\!86\%$). These results support \textbf{optimization-driven spectral preconditioning}: directly steering models toward well-conditioned regimes for stable, accurate learning.
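The abstract describes the method only at a high level. As a rough illustration (not the authors' implementation), the PyTorch sketch below shows one way such a loss could be assembled: a log-condition term on each layer's singular values, Chebyshev moments of the normalized spectrum penalized toward a flat target, and a capped mixing of the penalty into the task loss. All names (`cmr_penalty`, `lam`, `cap`), the moment target, and the capping rule are assumptions made for concreteness.

```python
# Hypothetical sketch of a CMR-style regularizer; the paper's exact proxy,
# moment targets, and mixing rule may differ.
import torch

def cmr_penalty(W: torch.Tensor, num_moments: int = 4) -> torch.Tensor:
    """Log-condition proxy plus Chebyshev-moment shaping for one weight matrix."""
    # Singular values of the 2-D weight matrix (differentiable via autograd).
    s = torch.linalg.svdvals(W)
    s_max, s_min = s.max(), s.min().clamp_min(1e-8)

    # Spectral-edge term: log of the condition number, a smooth proxy for kappa.
    edge_term = torch.log(s_max / s_min)

    # Interior term: Chebyshev moments of the spectrum mapped onto [-1, 1];
    # here each empirical moment is pushed toward 0 (a flat-spectrum target).
    x = 2.0 * (s - s_min) / (s_max - s_min).clamp_min(1e-8) - 1.0
    moment_term = torch.zeros((), device=W.device)
    T_prev, T_curr = torch.ones_like(x), x
    for _ in range(num_moments):
        moment_term = moment_term + T_curr.mean().pow(2)
        # Chebyshev recurrence: T_{n+1}(x) = 2x T_n(x) - T_{n-1}(x).
        T_prev, T_curr = T_curr, 2.0 * x * T_curr - T_prev
    return edge_term + moment_term

def total_loss(task_loss: torch.Tensor, model: torch.nn.Module,
               lam: float = 1e-2, cap: float = 0.1) -> torch.Tensor:
    """Decoupled, capped mixing: the spectral penalty is capped relative to
    the (detached) task loss so it cannot swamp the task gradients."""
    reg = sum(cmr_penalty(p) for p in model.parameters() if p.dim() == 2)
    return task_loss + torch.minimum(lam * reg, cap * task_loss.detach())
```

Under these assumptions, `total_loss` would simply replace the plain task loss in a standard training loop; the cap keeps the regularizer's contribution bounded so task gradients remain dominant.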
