Reviews: Regularization With Stochastic Transformations and Perturbations for Deep Semi-Supervised Learning
–Neural Information Processing Systems
This work proposes to use semi-supervised learning, in the form of an unsupervised loss term, to improve the regularization of CNNs. The idea (and the proposed loss) is conceptually simple: it enforces stability explicitly by minimizing the difference between predictions made for the same input data point under different stochastic transformations and perturbations. The paper focuses mainly on the experimental side, devoting most of its length to results obtained by adding the new loss to standard supervised CNNs. This is the stronger aspect of the work; the weaker aspects are the lack (or poor definition) of baselines and the absence of any theoretical justification, derivation, or discussion.

Novelty/originality: The main contribution is the application of the unsupervised loss term to control the stability of predictions under transformations or stochastic variability.
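To make the reviewed idea concrete, the stability loss described above can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the predictor, the Gaussian input perturbation (standing in for dropout or random augmentation), and all names here are hypothetical.

```python
import numpy as np

def stability_loss(predict, x, rng, n_passes=2, noise_scale=0.1):
    # Run the same input through the predictor n_passes times, each time
    # with a fresh random perturbation, then average the squared
    # differences between every pair of resulting prediction vectors.
    preds = [predict(x + rng.normal(scale=noise_scale, size=x.shape))
             for _ in range(n_passes)]
    loss, n_pairs = 0.0, 0
    for i in range(len(preds)):
        for j in range(i + 1, len(preds)):
            loss += float(np.mean((preds[i] - preds[j]) ** 2))
            n_pairs += 1
    return loss / n_pairs

# Toy predictor: a fixed linear map followed by softmax.
rng = np.random.default_rng(0)
W = rng.normal(size=(5, 3))

def predict(x):
    z = x @ W
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

x = rng.normal(size=(4, 5))
print(stability_loss(predict, x, rng))
```

In the paper's setting this unsupervised term is added to the usual supervised cross-entropy loss, so unlabeled inputs also contribute a training signal through the stability penalty.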