Dissecting Neural ODEs
Stefano Massaroli, Michael Poli, Jinkyoo Park, Atsushi Yamashita, Hajime Asama
Continuous deep learning architectures have recently re-emerged as variants of Neural Ordinary Differential Equations (Neural ODEs). The infinite-depth approach offered by these models theoretically bridges the gap between deep learning and dynamical systems; however, deciphering their inner workings is still an open challenge, and most of their applications are currently limited to their inclusion as generic black-box modules. In this work, we "open the box" and offer a system-theoretic perspective, including state augmentation strategies and robustness, with the aim of clarifying the influence of several design choices on the underlying dynamics. We also introduce novel architectures: among them, a Galerkin-inspired model with depth-varying parameters and neural ODEs with data-controlled vector fields.
Feb-19-2020
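To make the core idea concrete: a Neural ODE replaces a stack of discrete layers with a continuous-depth state evolution dz/dt = f(z, θ), whose output is obtained by numerically integrating the vector field. The following is a minimal illustrative sketch, not the paper's implementation: a toy one-hidden-layer MLP vector field integrated with explicit Euler steps (all weights, sizes, and the integrator choice here are assumptions for demonstration).

```python
import numpy as np

# Toy neural ODE sketch (illustrative only, not the authors' code):
# the "network" is the vector field f(z) = W2 tanh(W1 z + b1) + b2,
# and forward propagation is numerical integration of dz/dt = f(z).
rng = np.random.default_rng(0)

dim, hidden = 2, 8  # hypothetical state and hidden sizes
W1 = rng.normal(scale=0.5, size=(hidden, dim))
b1 = np.zeros(hidden)
W2 = rng.normal(scale=0.5, size=(dim, hidden))
b2 = np.zeros(dim)

def f(z):
    """Vector field parameterizing the continuous-depth dynamics dz/dt."""
    return W2 @ np.tanh(W1 @ z + b1) + b2

def odeint_euler(z0, t0=0.0, t1=1.0, steps=100):
    """Integrate z' = f(z) from t0 to t1 with fixed-step explicit Euler."""
    z = np.array(z0, dtype=float)
    dt = (t1 - t0) / steps
    for _ in range(steps):
        z = z + dt * f(z)  # one Euler step plays the role of one "layer"
    return z

z1 = odeint_euler([1.0, -1.0])
print(z1.shape)  # → (2,)
```

In practice one would use an adaptive solver and differentiate through the integration (e.g. via the adjoint method); the fixed-step Euler loop above is only meant to show how depth becomes the integration variable.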