Mathieu Salzmann
Learning the Number of Neurons in Deep Networks
Jose M. Alvarez, Mathieu Salzmann
- Oceania > Australia > Australian Capital Territory > Canberra (0.04)
- Europe > Switzerland > Vaud > Lausanne (0.04)
- Europe > Spain > Catalonia > Barcelona Province > Barcelona (0.04)
Backpropagation-Friendly Eigendecomposition
Wei Wang, Zheng Dang, Yinlin Hu, Pascal Fua, Mathieu Salzmann
- North America > Canada > Ontario > Toronto (0.14)
- Europe > Switzerland > Vaud > Lausanne (0.04)
- Asia > Myanmar > Tanintharyi Region > Dawei (0.04)
- Asia > China > Shaanxi Province > Xi'an (0.04)
Compression-aware Training of Deep Networks
Jose M. Alvarez, Mathieu Salzmann
In recent years, great progress has been made in a variety of application domains thanks to the development of increasingly deep neural networks. Unfortunately, the huge number of units in these networks makes them expensive both computationally and memory-wise. Several compression strategies have therefore been proposed, exploiting the fact that deep networks are over-parametrized. These methods, however, typically start from a network that has been trained in a standard manner, without any regard for its future compression. In this paper, we propose to account for compression explicitly during training. To this end, we introduce a regularizer that encourages the parameter matrix of each layer to have low rank throughout training. We show that accounting for compression during training lets us learn models that are much more compact than, yet at least as effective as, those obtained with state-of-the-art compression techniques.
- North America > United States > California > Santa Clara County > Los Altos (0.04)
- North America > United States > California > Los Angeles County > Long Beach (0.04)
- North America > United States > California > Alameda County > Berkeley (0.04)
- Europe > Switzerland > Vaud > Lausanne (0.04)
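The abstract above describes a regularizer that pushes each layer's parameter matrix toward low rank so the trained network can be compressed by a low-rank factorization. A minimal NumPy sketch of that idea is given below; it is an illustration, not the paper's implementation, and the function names (`nuclear_norm_penalty`, `compress_by_rank`) and the choice of the nuclear norm as the concrete low-rank surrogate are our assumptions:

```python
import numpy as np

def nuclear_norm_penalty(W, strength=0.01):
    """Low-rank regularizer sketch: the nuclear norm (sum of singular
    values) of a layer's parameter matrix W.  Adding this penalty to the
    training loss encourages W to have low rank, so that after training
    the layer can be compressed with little loss of accuracy."""
    singular_values = np.linalg.svd(W, compute_uv=False)
    return strength * singular_values.sum()

def compress_by_rank(W, rank):
    """Post-training compression: keep only the top `rank` singular
    directions, factorizing W into two thin matrices A and B with
    W ~ A @ B, which replaces one large layer by two small ones."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return U[:, :rank] * s[:rank], Vt[:rank]

# Toy usage: a matrix that is exactly rank 2 is reconstructed exactly
# from its rank-2 factors.
rng = np.random.default_rng(0)
W = rng.standard_normal((8, 2)) @ rng.standard_normal((2, 8))
A, B = compress_by_rank(W, rank=2)
print(np.allclose(A @ B, W))  # True
```

In the paper's setting the penalty would be summed over all layers and minimized jointly with the task loss; the factorization step is only applied once training is finished.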
Deep Subspace Clustering Networks
Pan Ji, Tong Zhang, Hongdong Li, Mathieu Salzmann, Ian Reid
We present a novel deep neural network architecture for unsupervised subspace clustering. This architecture is built upon deep auto-encoders, which non-linearly map the input data into a latent space. Our key idea is to introduce a novel self-expressive layer between the encoder and the decoder to mimic the "self-expressiveness" property that has proven effective in traditional subspace clustering. Being differentiable, our new self-expressive layer provides a simple yet effective way to learn pairwise affinities between all data points through standard backpropagation. Being non-linear, our neural-network-based method can cluster data points with complex (often non-linear) structures. We further propose pre-training and fine-tuning strategies that let us effectively learn the parameters of our subspace clustering networks. Our experiments show that our method significantly outperforms state-of-the-art unsupervised subspace clustering techniques.
- North America > United States > California > Los Angeles County > Long Beach (0.04)
- Europe > United Kingdom > England > Cambridgeshire > Cambridge (0.04)
- Asia > Middle East > Jordan (0.04)
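The self-expressiveness property the abstract refers to — each data point being representable as a linear combination of the other points in its subspace — can be illustrated with a small linear sketch. The snippet below uses a ridge-regularized closed form in plain NumPy rather than the learned network layer from the paper; the function names and the post-hoc diagonal zeroing are our simplifications:

```python
import numpy as np

def self_expressive_coefficients(X, lam=0.1):
    """Sketch of self-expressiveness: reconstruct each data point (a
    column of X) as a linear combination of the other points, i.e.
    minimize ||X - X C||^2 + lam * ||C||^2 over C.  In the paper, C is
    learned as the weights of a network layer via backpropagation; here
    we use the ridge-regression closed form instead."""
    n = X.shape[1]
    G = X.T @ X                       # Gram matrix of the data points
    C = np.linalg.solve(G + lam * np.eye(n), G)
    np.fill_diagonal(C, 0.0)          # zero the diagonal post hoc to
    return C                          # avoid trivial self-reconstruction

def affinity(C):
    """Symmetric affinity |C| + |C|^T, the usual input to spectral
    clustering in subspace-clustering pipelines."""
    return np.abs(C) + np.abs(C).T

# Toy usage: points drawn from two orthogonal 1-D subspaces express
# each other only within their own subspace, so the affinity matrix is
# block-diagonal.
rng = np.random.default_rng(1)
X = np.hstack([np.outer([1.0, 0.0], rng.standard_normal(5)),
               np.outer([0.0, 1.0], rng.standard_normal(5))])
A = affinity(self_expressive_coefficients(X))
```

The paper's contribution is to apply this idea not to the raw data but to the auto-encoder's latent representation, which makes the approach work for subspaces that are only linear after a learned non-linear mapping.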