Robust Convolutional Neural ODEs via Contractivity-promoting Regularization

Zakwan, Muhammad, Xu, Liang, Ferrari-Trecate, Giancarlo

arXiv.org Artificial Intelligence 

Neural networks can be fragile to input noise and adversarial attacks. In this work, we consider Convolutional Neural Ordinary Differential Equations (NODEs) - a family of continuous-depth neural networks represented by dynamical systems - and propose to use contraction theory to improve their robustness. Contractive Convolutional NODEs enjoy increased robustness, as slight perturbations of the features do not cause a significant change in the output. Contractivity can be induced during training by using a regularization term involving the Jacobian of the system dynamics. To reduce the computational burden, we show that it can also be promoted using carefully selected weight regularization terms for a class of NODEs with slope-restricted activation functions. The performance of the proposed regularizers is illustrated through benchmark image classification tasks on the MNIST and Fashion-MNIST datasets, where images are corrupted by different kinds of noise and attacks.
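As a rough illustration of the Jacobian-based regularizer mentioned in the abstract, the sketch below penalizes violations of a standard sufficient condition for contractivity: the largest eigenvalue of the symmetric part of the dynamics Jacobian should stay below a negative margin. The dynamics `f(x) = tanh(W x + b)`, the function names, and the margin value are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def dynamics_jacobian(W, b, x):
    """Jacobian of the illustrative dynamics f(x) = tanh(W x + b) w.r.t. x.

    J = diag(1 - tanh(W x + b)^2) @ W, using tanh'(z) = 1 - tanh(z)^2.
    """
    pre = W @ x + b
    return np.diag(1.0 - np.tanh(pre) ** 2) @ W

def contraction_penalty(W, b, x, margin=0.1):
    """Hinge penalty on the contraction condition at the point x.

    Contractivity in the 2-norm asks the largest eigenvalue of the
    symmetric part (J + J^T)/2 to be <= -margin; positive excess is
    penalized, zero penalty when the condition already holds.
    """
    J = dynamics_jacobian(W, b, x)
    sym = 0.5 * (J + J.T)
    lam_max = np.linalg.eigvalsh(sym)[-1]  # eigenvalues in ascending order
    return max(0.0, lam_max + margin)

# Strongly stable weights: condition satisfied, penalty is zero.
print(contraction_penalty(-3.0 * np.eye(2), np.zeros(2), np.zeros(2)))  # → 0.0
# Expansive weights: condition violated, penalty is positive.
print(contraction_penalty(np.eye(2), np.zeros(2), np.zeros(2)))
```

In training, such a penalty would be averaged over data points and added to the task loss; the weight-based regularizers mentioned in the abstract avoid forming the Jacobian at every sample.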