Author Response for The Unreasonable Effectiveness of Big Models for Semi-Supervised Learning

Neural Information Processing Systems

We thank the reviewers for their feedback and for their efforts in reviewing. We respond to each comment below. On the concern that "overall, there is no significant contribution to unsupervised pre-training": we believe our contributions are significant, even though our main contribution is a detailed procedure rather than a theorem, architecture, or other artifact. Indeed, R3 recognizes that "the simple semi-supervised framework is still [...] I think it will inspire several future works." While we believe ImageNet is a much more [...] these results can be further improved with better augmentations during fine-tuning and an extra distillation step.






Neural Networks on Symmetric Spaces of Noncompact Type

Nguyen, Xuan Son, Yang, Shuo, Histace, Aymeric

arXiv.org Machine Learning

Recent works have demonstrated promising performances of neural networks on hyperbolic spaces and symmetric positive definite (SPD) manifolds. These spaces belong to a family of Riemannian manifolds referred to as symmetric spaces of noncompact type. In this paper, we propose a novel approach for developing neural networks on such spaces. Our approach relies on a unified formulation of the distance from a point to a hyperplane on the considered spaces. We show that some existing formulations of the point-to-hyperplane distance can be recovered by our approach under specific settings. Furthermore, we derive a closed-form expression for the point-to-hyperplane distance in higher-rank symmetric spaces of noncompact type equipped with G-invariant Riemannian metrics. The derived distance then serves as a tool to design fully-connected (FC) layers and an attention mechanism for neural networks on the considered spaces. Our approach is validated on challenging benchmarks for image classification, electroencephalogram (EEG) signal classification, image generation, and natural language inference.
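To make the central idea concrete, the sketch below implements one of the existing special cases the abstract says its unified formulation recovers: the point-to-hyperplane distance on the Poincaré ball from Ganea et al.'s hyperbolic neural networks, whose values can serve as the (unsigned) logits of a hyperbolic FC layer. This is an illustrative NumPy sketch of that known hyperbolic case only, not the paper's general higher-rank construction; the function names and the curvature parameter `c` are our own choices.

```python
import numpy as np

def mobius_add(u, v, c=1.0):
    """Mobius addition on the Poincare ball of curvature -c."""
    uv = np.dot(u, v)
    uu = np.dot(u, u)
    vv = np.dot(v, v)
    num = (1 + 2 * c * uv + c * vv) * u + (1 - c * uu) * v
    den = 1 + 2 * c * uv + c ** 2 * uu * vv
    return num / den

def dist_to_hyperplane(x, p, a, c=1.0):
    """Distance from point x to the hyperbolic hyperplane through p with
    normal direction a (Ganea et al.'s Poincare-ball formula).  Stacking
    one such distance per class gives the scores of a hyperbolic FC layer."""
    z = mobius_add(-p, x, c)          # translate so the hyperplane passes through 0
    zz = np.dot(z, z)
    return (1 / np.sqrt(c)) * np.arcsinh(
        2 * np.sqrt(c) * abs(np.dot(z, a))
        / ((1 - c * zz) * np.linalg.norm(a)))
```

A point lying on the hyperplane (e.g. the base point p itself) has distance zero, and points off the hyperplane get strictly positive distances, which is the sanity check one would run before using such distances as layer outputs.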