Computing Linear Regions in Neural Networks with Skip Connections

Johnny Joyce, Jan Verschelde

arXiv.org Artificial Intelligence 

A neural network is a composition of neurons, where each neuron is a nonlinear function of its inputs and of parameters called weights and biases. The nonlinearity of the network can be understood via tropical geometry, in particular for networks with ReLU activation functions, which are piecewise linear. A linear region of such a network is a connected region of the input space on which the map defined by the network is linear. We introduce an algorithm that computes all linear regions of a ReLU network, including networks with skip connections, which add the output of earlier layers to the input of later layers, skipping over the layers in between. Knowing the linear regions allows for quicker predictions, as demonstrated by our new caching algorithm. The expository paper [2] offers promising avenues to study neural networks.
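To make the notion of a linear region concrete, here is a minimal sketch (not the paper's algorithm, and with hypothetical weights): for a ReLU network, the activation pattern of an input, i.e. which neurons have positive pre-activation, determines the affine piece that applies, so two inputs with the same pattern lie in the same linear region. A skip connection simply adds the input back into the output, which leaves the map piecewise linear.

```python
def relu(v):
    return [max(0.0, x) for x in v]

# Toy network, hypothetical weights: one hidden ReLU layer of two neurons,
# a linear output, and a skip connection adding the first input coordinate.
W1 = [[1.0, -1.0], [0.5, 2.0]]
w2 = [1.0, -2.0]

def forward(x):
    # Pre-activations of the hidden layer.
    pre = [sum(w * xi for w, xi in zip(row, x)) for row in W1]
    hid = relu(pre)
    # Output with skip connection: skip adds x[0] past the hidden layer.
    out = sum(w * h for w, h in zip(w2, hid)) + x[0]
    # The sign pattern of the pre-activations labels the linear region.
    pattern = tuple(p > 0 for p in pre)
    return out, pattern

# Two nearby inputs with the same activation pattern share a linear region,
# so the same cached affine map could be reused for both predictions.
y1, p1 = forward([1.0, 0.2])
y2, p2 = forward([1.1, 0.25])
```

The activation pattern serves as a cache key: once the affine map of a region is known, later inputs landing in that region can be evaluated without recomputing the ReLUs.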