


464074179972cbbd75a39abc6954cd12-AuthorFeedback.pdf

Neural Information Processing Systems

We are grateful to the reviewers for their insightful comments on our submission. To this end, we intended to remove any over-fitting effect by using a dense training set for clear illustration in the first and fourth columns of Figure 2. (ii) Section 4.2 shows how the NLL helps alleviate the over-fitting issue, where the validation set of 10,000 samples was generated uniformly in Ω.
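The over-fitting check described above can be sketched as follows. This is a minimal stand-in, not the submission's actual model: the domain Ω = [0, 1], the sine target, and the polynomial fit with a Gaussian likelihood are all hypothetical, chosen only to show how the NLL on a uniformly drawn validation set exposes over-fitting that a dense training fit can hide.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the paper's setup: Omega = [0, 1], sine target.
def f_true(x):
    return np.sin(2 * np.pi * x)

x_train = rng.uniform(0, 1, 30)
y_train = f_true(x_train) + rng.normal(0, 0.1, x_train.size)

# Validation set of 10,000 samples drawn uniformly in Omega, as in the rebuttal.
x_val = rng.uniform(0, 1, 10_000)
y_val = f_true(x_val) + rng.normal(0, 0.1, x_val.size)

def gaussian_nll(y, mu, sigma):
    # Mean negative log-likelihood under a homoscedastic Gaussian model.
    return np.mean(0.5 * np.log(2 * np.pi * sigma**2)
                   + (y - mu) ** 2 / (2 * sigma**2))

results = {}
for degree in (3, 25):  # modest fit vs. over-parameterized fit
    coeffs = np.polyfit(x_train, y_train, degree)
    resid = y_train - np.polyval(coeffs, x_train)
    sigma = max(resid.std(), 1e-3)  # floor to avoid log(0)
    train_nll = gaussian_nll(y_train, np.polyval(coeffs, x_train), sigma)
    val_nll = gaussian_nll(y_val, np.polyval(coeffs, x_val), sigma)
    results[degree] = (train_nll, val_nll)
    print(f"degree {degree}: train NLL {train_nll:.3f}, val NLL {val_nll:.3f}")
```

The over-parameterized fit achieves a low training NLL but a much higher NLL on the uniform validation set, which is exactly the signal the uniformly sampled validation set is meant to provide.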


f7a82ce7e16d9687e7cd9a9feb85d187-AuthorFeedback.pdf

Neural Information Processing Systems

"The results are impressive, non-trivial and interesting" (Reviewer 4). Also note that the previous applications of VC theory were for non-robust learning, hence somewhat different from the current application, which requires several new ideas. Finally, Reviewer 3 asks about the time complexity of the paper's two efficient algorithms: learning piecewise polynomials and interval classification.
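To fix ideas on the interval-classification subproblem, here is a generic sketch; it is not the paper's own algorithm, which is not reproduced in this excerpt. For one-dimensional data, the minimum-training-error interval classifier (predict +1 exactly inside [a, b]) can be found in O(n log n): sort the points, map labels to ±1 scores, and run a Kadane maximum-subarray scan, since minimizing errors is equivalent to maximizing (#positives − #negatives) inside a contiguous run of the sorted points.

```python
def best_interval_errors(xs, ys):
    """Minimum number of training errors over interval classifiers
    h(x) = +1 iff a <= x <= b, for labels ys in {-1, +1}.

    Sorts the points (O(n log n)), then a single Kadane pass finds the
    contiguous run maximizing (#positives - #negatives) inside it.
    """
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    scores = [1 if ys[i] == 1 else -1 for i in order]
    best = cur = 0
    for s in scores:
        cur = max(0, cur + s)   # empty interval is always allowed
        best = max(best, cur)
    total_pos = sum(1 for y in ys if y == 1)
    # errors = positives left outside + negatives caught inside
    #        = total_pos - (pos_inside - neg_inside)
    return total_pos - best

# Example: the best interval covers x in [1, 4] and misclassifies
# only the negative point at x = 3.
print(best_interval_errors([0, 1, 2, 3, 4, 5], [-1, 1, 1, -1, 1, -1]))
```

The same sort-then-scan pattern is a common device for one-dimensional threshold and interval hypotheses; the dominant cost is the sort.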


To Reviewer 1: 1. Method simplistic, places too many constraints on activations (only ReLU-like activations)

Neural Information Processing Systems

We believe the proposed H-regularization is novel and by no means simplistic; it is well suited to one-class learning. ReLU-like activations are widely used, e.g., in Transformers and ResNets, so this requirement does not limit the applicability of our method. In our experiments, we followed the baselines and used the same datasets as they did.