To Reviewer 1: 1. The method is simplistic and places too many constraints on the activation (only ReLU-like activations)
–Neural Information Processing Systems
We believe the proposed H-regularization is novel and by no means simplistic; it is well suited to one-class learning. ReLU-like activations are widely used (e.g., in Transformer and ResNet architectures), so this requirement does not limit the applicability of our method. In our experiments, we followed the baselines and used the same datasets as they did.
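To illustrate the point about applicability (not the paper's actual H-regularization, whose definition is not reproduced in this rebuttal), the sketch below uses a generic L2 penalty on ReLU hidden activations as a stand-in: any regularizer defined on ReLU-like activations plugs directly into a standard feed-forward block of the kind found in ResNet and Transformer architectures.

```python
import numpy as np

# Minimal two-layer network with a ReLU hidden layer, NumPy only.
# The activation penalty here is a generic stand-in, NOT the paper's method.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((8, 16))
W2 = rng.standard_normal((16, 1))
x = rng.standard_normal((4, 8))   # a small batch of inputs

h = np.maximum(x @ W1, 0.0)       # ReLU-like hidden activation
out = h @ W2

task_loss = (out ** 2).mean()                 # placeholder task objective
reg = 0.1 * (h ** 2).mean()                   # illustrative penalty on the activations
loss = task_loss + reg
```

The point is architectural: the penalty is computed from the hidden activations already produced in the forward pass, so restricting attention to ReLU-like activations excludes almost none of the networks used in practice.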
Aug-16-2025, 20:21:35 GMT