The noise level in linear regression with dependent data

Ziemann, Ingvar, Tu, Stephen, Pappas, George J., Matni, Nikolai

arXiv.org Machine Learning 

Ordinary least squares (OLS) regression from a finite sample is one of the most ubiquitous and widely used techniques in machine learning. When faced with independent data, there are now sharp tools available to analyze its success optimally under relatively general assumptions. Indeed, a non-asymptotic theory matching the classical asymptotically optimal understanding from statistics [van der Vaart, 2000] has been developed over the last decade [Hsu et al., 2012, Oliveira, 2016, Mourtada, 2022]. However, once we relax the independence assumption and move toward data that exhibits correlations, the situation is much less well-understood--even for a problem as seemingly simple as linear regression. While sharp asymptotics are available through various limit theorems, there are no general results matching them in the finite-sample regime. In this paper, we study the instance-specific performance of ordinary least squares in a setting with dependent data--and, in contrast to much contemporary work on the theme, without imposing realizability.
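The setting the abstract describes can be illustrated with a minimal sketch: covariates generated by an AR(1) process (so consecutive samples are correlated rather than independent), and a target that is not exactly linear in the covariates (so the model is misspecified, i.e., realizability fails). All constants below (AR coefficient, dimensions, noise scale, nonlinearity) are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Covariates from an AR(1) process: consecutive rows are correlated,
# violating the i.i.d. assumption behind classical OLS analyses.
T, d = 500, 3
X = np.zeros((T, d))
for t in range(1, T):
    X[t] = 0.9 * X[t - 1] + rng.standard_normal(d)

# Non-realizable target: a linear signal plus a nonlinear term
# that no linear predictor can fit exactly, plus observation noise.
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + 0.3 * np.sin(X[:, 0]) + 0.5 * rng.standard_normal(T)

# OLS estimate; lstsq solves the least-squares problem stably.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat)
```

Despite the dependence and misspecification, the OLS estimate here still lands near the best linear approximation; quantifying how fast this happens, instance by instance and with finite samples, is the question the paper addresses.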
