Invariance and Stability of Deep Convolutional Representations

Neural Information Processing Systems

In this paper, we study deep signal representations that are near-invariant to groups of transformations and stable to the action of diffeomorphisms without losing signal information. This is achieved by generalizing the multilayer kernel introduced in the context of convolutional kernel networks and by studying the geometry of the corresponding reproducing kernel Hilbert space. We show that the signal representation is stable, and that models from this functional space, such as a large class of convolutional neural networks, may enjoy the same stability.
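Schematically, stability to the action of a diffeomorphism can be written as a Lipschitz-type bound on the representation. The sketch below is a generic form, not the paper's exact statement; the constants \(C_1, C_2\) and the precise norms are assumptions for illustration. Writing \(L_\tau x(u) = x(u - \tau(u))\) for the action of a \(C^1\) diffeomorphism \(\tau\) and \(\Phi\) for the signal representation, stability means

```latex
\|\Phi(L_\tau x) - \Phi(x)\| \;\le\; \left( C_1 \,\|\nabla \tau\|_\infty + C_2 \,\|\tau\|_\infty \right) \|x\|,
```

so that small deformations (small \(\|\nabla \tau\|_\infty\)) and small translations (small \(\|\tau\|_\infty\)) perturb the representation only slightly; near-invariance to translations corresponds to a small \(C_2\).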


Reviews: Invariance and Stability of Deep Convolutional Representations

Neural Information Processing Systems

The primary focus of the paper is the convolutional kernel network (CKN) [13, 14]. The authors study the stability, with respect to the action of C^1 diffeomorphisms (such as translations), in the sense of Eq. (4), of the representation formed by CKNs. They show that for norm-preserving and non-expansive kernels [(A1)-(A2) in line 193], stability holds for appropriately chosen patch sizes [(A3)]. An extension from (R^d, +) to locally compact groups is sketched in Section 4. The paper is nicely organized, clearly written, and technically sound, combining ideas from two exciting areas (deep networks and kernels). The stability result can be of interest to the ML community.
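The norm-preserving and non-expansive conditions mentioned in the review can be checked numerically. The sketch below is illustrative, not the paper's construction: it assumes a homogeneous dot-product kernel K(x, y) = ||x||·||y||·κ(⟨x, y⟩/(||x||·||y||)) with the specific choice κ(t) = e^(t−1), which satisfies κ(1) = 1 (norm preservation) and κ'(1) = 1 (non-expansiveness of the kernel feature map).

```python
import numpy as np


def kernel(x, y, kappa=lambda t: np.exp(t - 1.0)):
    """Homogeneous dot-product kernel K(x,y) = ||x||*||y||*kappa(<x,y>/(||x||*||y||)).

    kappa(t) = exp(t - 1) is one illustrative choice with kappa(1) = 1
    and kappa'(1) = 1; the kernel choice here is an assumption, not the
    paper's exact construction.
    """
    nx, ny = np.linalg.norm(x), np.linalg.norm(y)
    if nx == 0.0 or ny == 0.0:
        return 0.0
    return nx * ny * kappa(np.dot(x, y) / (nx * ny))


def rkhs_distance(x, y):
    """Distance ||phi(x) - phi(y)|| in the RKHS, via the kernel trick:
    ||phi(x) - phi(y)||^2 = K(x,x) + K(y,y) - 2*K(x,y)."""
    sq = kernel(x, x) + kernel(y, y) - 2.0 * kernel(x, y)
    return np.sqrt(max(sq, 0.0))


# Empirical check of the two properties on random inputs.
rng = np.random.default_rng(0)
for _ in range(200):
    x, y = rng.normal(size=5), rng.normal(size=5)
    # Norm preservation: K(x, x) = ||x||^2.
    assert abs(kernel(x, x) - np.linalg.norm(x) ** 2) < 1e-9
    # Non-expansiveness: ||phi(x) - phi(y)|| <= ||x - y||.
    assert rkhs_distance(x, y) <= np.linalg.norm(x - y) + 1e-9
```

Non-expansiveness is what makes the per-layer mapping 1-Lipschitz, so distances (and hence the stability bounds) do not blow up as layers are composed.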


Invariance and Stability of Deep Convolutional Representations

Bietti, Alberto, Mairal, Julien

Neural Information Processing Systems