Author Feedback – Neural Information Processing Systems
" The algorithm is rather similar to the denoising algorithm referred to in the paper . " The second part of our paper (denoised bigNN) may be viewed as an improvement of the denoising algorithm Indeed, the two algorithms share some similarity. " In the extreme cases... it seems there may be a possibility that " The idea of distributing the data and thus speed up the process of classification seems somewhat inferior to compressing " Given the aim of the paper, I think it's extremely important to experimentally compare against recently " KWS17 is theory oriented and the authors did not provide code. Note our R-based code (to be released publicly) has room for improvement. " Perhaps more discussion/theory on when/why one should use pasting rather than bagging to ensemble k-NN estimators " Separately, depending on the dataset (i.e., the feature space and distribution), I would suspect that even just taking 1 Our proof would work even for only one subsample.