

Size and depth of monotone neural networks: interpolation and approximation

Neural Information Processing Systems

Monotone functions and data sets arise in a variety of applications. We study the interpolation problem for monotone data sets: the input is a monotone data set with n points, and the goal is to find a size- and depth-efficient monotone neural network, with \emph{non-negative parameters} and threshold units, that interpolates the data set. We show that there are monotone data sets that cannot be interpolated by a monotone network of depth 2. On the other hand, we prove that for every monotone data set with n points in $\mathbb{R}^d$, there exists an interpolating monotone network of depth 4 and size $O(nd)$. Our interpolation result implies that every monotone function over $[0,1]^d$ can be approximated arbitrarily well by a depth-4 monotone network, improving on the previous best-known construction of depth $d+1$.
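To make the model class concrete, the sketch below evaluates a feedforward threshold network whose weights are all non-negative and checks empirically that it is monotone (increasing any input coordinate cannot decrease the output). This is a minimal illustration of the network class studied in the abstract, with arbitrary random layers; it is not the paper's depth-4 interpolating construction, and the layer sizes are hypothetical.

```python
import numpy as np

def threshold(z):
    # Heaviside threshold unit: outputs 1 where z >= 0, else 0.
    return (z >= 0).astype(float)

def monotone_net(x, layers):
    """Evaluate a threshold network with non-negative weight matrices.

    Because every weight is >= 0 and the threshold activation is
    non-decreasing, x <= y coordinatewise implies f(x) <= f(y).
    Biases may be arbitrary (they only shift the thresholds).
    """
    h = x
    for W, b in layers[:-1]:
        h = threshold(h @ W + b)
    W, b = layers[-1]
    return h @ W + b  # linear output layer, non-negative weights

rng = np.random.default_rng(0)
d = 3
# Random non-negative weights; hypothetical sizes, not the O(nd) construction.
layers = [
    (rng.random((d, 5)), -rng.random(5)),
    (rng.random((5, 4)), -rng.random(4)),
    (rng.random((4, 1)), np.zeros(1)),
]

# Empirical monotonicity check: perturb inputs upward coordinatewise.
x = rng.random((100, d))
y = x + rng.random((100, d))  # y >= x in every coordinate
assert np.all(monotone_net(x, layers) <= monotone_net(y, layers))
```

The check passes for any choice of non-negative weight matrices, which is exactly why the interpolation question is non-trivial: the parameters are constrained, so depth and size must do the work.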