Minimax Optimality of Classical Scaling Under General Noise Conditions

Siddharth Vishwanath, Ery Arias-Castro

arXiv.org Machine Learning 

We establish the consistency of classical scaling under a broad class of noise models, encompassing many commonly studied cases in the literature. Our approach requires only finite fourth moments of the noise, significantly weakening standard assumptions. We derive convergence rates for classical scaling and establish matching minimax lower bounds, demonstrating that classical scaling achieves minimax optimality in recovering the true configuration even when the input dissimilarities are corrupted by noise.
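For reference, classical scaling (classical multidimensional scaling) recovers a point configuration from pairwise dissimilarities by double centering the squared dissimilarity matrix and taking a truncated eigendecomposition. The sketch below is a minimal illustration in Python/NumPy, not the authors' code; the function name and the choice to clip negative eigenvalues (which can arise when the dissimilarities are noisy) are assumptions made for this example.

import numpy as np

def classical_scaling(D, d):
    """Classical scaling: embed an n x n dissimilarity matrix D into R^d.
    Illustrative sketch; the recovered configuration is unique only up to
    a rigid motion (rotation, reflection, translation)."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # doubly centered (Gram-like) matrix
    eigvals, eigvecs = np.linalg.eigh(B)     # eigenvalues in ascending order
    idx = np.argsort(eigvals)[::-1][:d]      # keep the top-d eigenpairs
    lam = np.clip(eigvals[idx], 0, None)     # guard against noise-induced negative eigenvalues
    return eigvecs[:, idx] * np.sqrt(lam)    # n x d configuration

With noiseless Euclidean dissimilarities, B is exactly the centered Gram matrix of the true configuration and the embedding is exact up to rigid motion; the paper's question is how accurately this same procedure recovers the configuration when D is corrupted by noise with only finite fourth moments.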