Framing RNN as a kernel method: A neural ODE approach
Fermanian, Adeline, Marion, Pierre, Vert, Jean-Philippe, Biau, Gérard
Building on the interpretation of a recurrent neural network (RNN) as a continuous-time neural differential equation, we show, under appropriate conditions, that the solution of an RNN can be viewed as a linear function of a specific feature set of the input sequence, known as the signature. This connection allows us to frame an RNN as a kernel method in a suitable reproducing kernel Hilbert space. As a consequence, we obtain theoretical guarantees on generalization and stability for a large class of recurrent networks. Our results are illustrated on simulated datasets.
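The signature mentioned in the abstract is the sequence of iterated integrals of an input path; the paper's claim is that, under suitable conditions, the RNN output is a linear functional of these features. As a minimal illustration (not the paper's implementation), the sketch below computes the depth-2 signature of a piecewise-linear path with NumPy and applies a linear readout; the function names and the truncation depth are choices made here for exposition.

```python
import numpy as np

def signature_depth2(path):
    """Depth-2 truncated signature of a piecewise-linear path.

    path: (T, d) array of points. Returns the flattened signature
    (omitting the constant leading 1): the d level-1 increments
    followed by the d*d level-2 iterated integrals.
    """
    increments = np.diff(path, axis=0)   # (T-1, d) segment increments
    level1 = increments.sum(axis=0)      # total increment X_T - X_0
    d = path.shape[1]
    level2 = np.zeros((d, d))
    running = np.zeros(d)                # increment accumulated so far
    for delta in increments:
        # Chen's relation for a linear segment: the cross term uses the
        # path value so far, plus the segment's own 0.5 * delta ⊗ delta.
        level2 += np.outer(running, delta) + 0.5 * np.outer(delta, delta)
        running += delta
    return np.concatenate([level1, level2.ravel()])

# A linear readout on signature features, as in a kernel/feature-map view
# (weights here are arbitrary, for illustration only).
path = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0]])
features = signature_depth2(path)
weights = np.ones_like(features)
prediction = weights @ features
```

For this two-segment path the level-2 terms satisfy the shuffle identity S^{12} + S^{21} = (X^1 increment)(X^2 increment), a quick sanity check on any signature computation.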
Jun-2-2021