A Proofs of Main Upper and Lower Bounds

Neural Information Processing Systems 

We use the Euclidean-LSH construction of Lemma 4.2 with parameter ε/L. In any sub-region of the k-dimensional subspace that has small diameter, the Lipschitz property of the function together with Lemma 4.2 implies that we can approximate the function by a single constant while incurring only O(ε) error.

A rare scenario that we handle for the sake of completeness is when there exist buckets of such small volume that no training point maps to them, so no value is learnt for those buckets. If at test time a point maps to such an empty bucket, we simply run an approximate nearest-neighbor search among the training points, and predict with the value of the bucket that this approximate nearest neighbor maps to. To control the error of this procedure, we take enough samples to form an approximate ε/L-cover of the domain, up to a large enough constant.
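The prediction procedure above can be sketched in code. This is a minimal illustration, not the paper's construction: the class name, hash width w = ε/L, and the use of an exact nearest-neighbor scan in place of an approximate one are all assumptions made for the sketch.

```python
import numpy as np

class LSHRegressor:
    """Sketch of the bucketed predictor described above (names hypothetical).

    Points are hashed with a Euclidean LSH of bucket width w = eps / L;
    each non-empty bucket stores the mean label of the training points
    that map to it. A test point landing in an empty bucket falls back
    to nearest-neighbor search (exact here, standing in for approximate)
    and uses that neighbor's bucket value.
    """

    def __init__(self, dim, eps, lipschitz, n_hashes=8, seed=0):
        rng = np.random.default_rng(seed)
        self.w = eps / lipschitz                    # bucket width eps/L
        self.A = rng.normal(size=(n_hashes, dim))   # random projections
        self.b = rng.uniform(0, self.w, size=n_hashes)  # random offsets

    def _bucket(self, x):
        # Standard Euclidean LSH: floor of shifted projection / width.
        return tuple(np.floor((self.A @ x + self.b) / self.w).astype(int))

    def fit(self, X, y):
        self.X_train, self.y_train = np.asarray(X), np.asarray(y)
        sums, counts = {}, {}
        for x, yi in zip(self.X_train, self.y_train):
            k = self._bucket(x)
            sums[k] = sums.get(k, 0.0) + yi
            counts[k] = counts.get(k, 0) + 1
        # Each non-empty bucket learns the mean of its training labels.
        self.values = {k: sums[k] / counts[k] for k in sums}
        return self

    def predict(self, x):
        k = self._bucket(x)
        if k in self.values:                # bucket has a learnt value
            return self.values[k]
        # Rare case: empty bucket -> fall back to the nearest training
        # point and use the value of *its* bucket.
        j = np.argmin(np.linalg.norm(self.X_train - x, axis=1))
        return self.values[self._bucket(self.X_train[j])]
```

Since every prediction is a mean of training labels, the output always lies in the range of the observed labels; the Lipschitz assumption is what bounds the additional error contributed by the small bucket diameter and the nearest-neighbor fallback.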
