In Section A we provide the deferred proofs from Section 3. In Section B we provide the deferred proofs from Section 4. In Section C we show how to extend our main lower bound to apply to learning in Wasserstein distance. In Section D we fill in the details alluded to in the introduction for how supervised learning lower bounds can imply unsupervised learning lower bounds.

A Deferred Proofs From Section 3

We begin by reviewing standard concepts pertaining to establishing statistical query lower bounds for unsupervised learning problems, as developed in [FGR+17]. Recall that for distributions p and q and a reference distribution r, the pairwise correlation of p and q with respect to r is defined as χ_r(p, q) := ∫ p(x) q(x) / r(x) dx - 1. Note that when p = q, this is simply the chi-squared divergence χ²(p, r) between p and r.
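As a quick numerical sanity check of this identity (our illustration, not part of the paper), the sketch below computes the pairwise correlation and the chi-squared divergence for discrete distributions given as probability vectors; the helper names pairwise_correlation and chi_squared are ours.

```python
import numpy as np

def pairwise_correlation(p, q, r):
    """chi_r(p, q) = sum_x p(x) * q(x) / r(x) - 1 for discrete distributions
    given as probability vectors over a common finite domain."""
    p, q, r = map(np.asarray, (p, q, r))
    return float(np.sum(p * q / r) - 1.0)

def chi_squared(p, r):
    """Chi-squared divergence chi^2(p, r) = sum_x (p(x) - r(x))^2 / r(x)."""
    p, r = map(np.asarray, (p, r))
    return float(np.sum((p - r) ** 2 / r))

# Example on a four-element domain with the uniform reference distribution.
p = [0.4, 0.3, 0.2, 0.1]
r = [0.25, 0.25, 0.25, 0.25]

# When p = q, the pairwise correlation collapses to the chi-squared divergence.
assert abs(pairwise_correlation(p, p, r) - chi_squared(p, r)) < 1e-12
print(pairwise_correlation(p, p, r), chi_squared(p, r))  # both equal 0.2
```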


Let Z be a distributional search problem over a set of distributions D and a set of solutions F. For γ, β > 0, if N = SD(Z, γ, β) denotes the statistical dimension of Z with pairwise correlations (γ, β), then any statistical query algorithm for Z requires at least Nγ/(β - γ) queries to STAT(√(2γ)) or VSTAT(1/(6γ)).
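To make the quantities in this statement concrete, the following sketch (ours, with hypothetical parameter values; the function name sq_query_lower_bound is our own) simply evaluates the query lower bound Nγ/(β - γ) together with the oracle parameters √(2γ) and 1/(6γ).

```python
import math

def sq_query_lower_bound(N, gamma, beta):
    """Given N = SD(Z, gamma, beta), return the lower bound N * gamma / (beta - gamma)
    on the number of queries, the STAT tolerance sqrt(2 * gamma), and the VSTAT
    parameter 1 / (6 * gamma) appearing in the statement above."""
    assert 0 < gamma < beta
    min_queries = N * gamma / (beta - gamma)
    stat_tolerance = math.sqrt(2 * gamma)
    vstat_parameter = 1 / (6 * gamma)
    return min_queries, stat_tolerance, vstat_parameter

# Hypothetical numbers: a statistical dimension of N = 2**40 with gamma = d**(-2)
# and beta = 1 in dimension d = 100 rules out SQ algorithms making fewer than
# roughly 2**40 / d**2 queries to STAT(sqrt(2)/d) or VSTAT(d**2 / 6).
d = 100
print(sq_query_lower_bound(N=2**40, gamma=d**-2, beta=1.0))
```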
We will use the following two lemmas from [DKS17] and [DK20].