
Training Neural Networks is NP-Hard in Fixed Dimension

Froese, Vincent, Hertrich, Christoph

arXiv.org Artificial Intelligence

We study the parameterized complexity of training two-layer neural networks with respect to the dimension of the input data and the number of hidden neurons, considering ReLU and linear threshold activation functions. Although the computational complexity of these problems has been studied numerous times in recent years, several questions remain open. We answer questions by Arora et al. [ICLR '18] and Khalife and Basu [IPCO '22] by showing that both problems are NP-hard for two dimensions, which excludes any polynomial-time algorithm for constant dimension. We also answer a question by Froese et al. [JAIR '22] by proving W[1]-hardness for four ReLUs (or two linear threshold neurons) with zero training error. Finally, in the ReLU case, we show fixed-parameter tractability for the combined parameter number of dimensions and number of ReLUs if the network is assumed to compute a convex map. Our results settle the complexity status regarding these parameters almost completely.
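To make the object of study concrete, here is a minimal sketch (not from the paper) of the two-layer ReLU training problem: given data points, evaluate the squared training error of a network with k hidden ReLUs. The hardness results above concern deciding whether weights achieving a target error (e.g. zero) exist; the function names and the toy data are illustrative only.

```python
import numpy as np

def relu(z):
    # Elementwise ReLU activation max(0, z)
    return np.maximum(0.0, z)

def training_error(X, y, W, b, a):
    """Squared training error of a two-layer ReLU network.

    X: (n, d) input points, y: (n,) targets,
    W: (k, d) hidden-layer weights, b: (k,) hidden biases,
    a: (k,) output-layer weights.
    """
    hidden = relu(X @ W.T + b)   # (n, k) hidden-neuron activations
    preds = hidden @ a           # (n,) network outputs
    return float(np.sum((preds - y) ** 2))

# Toy 1-D example: a single ReLU exactly fits y = max(0, x),
# so the training error is zero on these points.
X = np.array([[-1.0], [0.0], [2.0]])
y = np.array([0.0, 0.0, 2.0])
W = np.array([[1.0]])
b = np.array([0.0])
a = np.array([1.0])
print(training_error(X, y, W, b, a))
```

Evaluating the error for fixed weights is easy; the paper's point is that *searching* for zero-error weights is NP-hard already for 2-dimensional inputs.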


Training Fully Connected Neural Networks is $\exists\mathbb{R}$-Complete

Bertschinger, Daniel, Hertrich, Christoph, Jungeblut, Paul, Miltzow, Tillmann, Weber, Simon

arXiv.org Artificial Intelligence

We consider the algorithmic problem of finding the optimal weights and biases for a two-layer fully connected neural network to fit a given set of data points. This problem is known as empirical risk minimization in the machine learning community. We show that the problem is $\exists\mathbb{R}$-complete. This complexity class can be defined as the set of algorithmic problems that are polynomial-time equivalent to finding real roots of a polynomial with integer coefficients. Furthermore, we show that arbitrary algebraic numbers are required as weights to be able to train some instances to optimality, even if all data points are rational. Our results hold even if the following restrictions are all added simultaneously. $\bullet$ There are exactly two output neurons. $\bullet$ There are exactly two input neurons. $\bullet$ The data has only 13 different labels. $\bullet$ The number of hidden neurons is a constant fraction of the number of data points. $\bullet$ The target training error is zero. $\bullet$ The ReLU activation function is used. This shows that even very simple networks are difficult to train. The result explains why typical methods for $\mathsf{NP}$-complete problems, like mixed-integer programming or SAT-solving, cannot train neural networks to global optimality, unless $\mathsf{NP}=\exists\mathbb{R}$. We strengthen a recent result by Abrahamsen, Kleist and Miltzow [NeurIPS 2021].
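As a small illustration (my own, not from the paper) of the complexity class involved: $\exists\mathbb{R}$ can be characterized via the problem of deciding whether a polynomial with integer coefficients has a real root. For a single univariate polynomial this check is easy; the complete problem concerns multivariate systems. The example below also echoes the paper's point that optimal solutions can be irrational algebraic numbers (here $\pm\sqrt{2}$) even when all input coefficients are rational.

```python
import numpy as np

def has_real_root(coeffs, tol=1e-9):
    """Return True if the univariate polynomial with the given
    coefficients (highest degree first) has at least one real root."""
    roots = np.roots(coeffs)  # all complex roots via companion matrix
    return bool(np.any(np.abs(roots.imag) < tol))

print(has_real_root([1, 0, -2]))  # x^2 - 2: real roots ±sqrt(2)
print(has_real_root([1, 0, 1]))   # x^2 + 1: only imaginary roots
```

The numeric tolerance is a practical concession; the decision problem defining $\exists\mathbb{R}$ is exact.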


For Safer Coastal Living, Thank Engineers

#artificialintelligence

Those who live along a coastline or vacation there have the privilege of access to activities such as fishing, surfing, and swimming. Mother Nature can turn those coastal regions into dangerous places, however. Threats can come in the form of flash floods, hurricanes, and tropical storms. Coastal residents must contend with natural disasters and face environmental dangers such as higher tides, receding shorelines, and heightened flood risks. Over the years, engineers have completed projects that make coastal living much safer, such as building effective seawalls and storm drains.


Americans who live near border say Trump's wall is unwelcome

PBS NewsHour

Passengers embark on the U.S. side of the last hand-pulled ferry at Los Ebanos, Texas on the Mexico-U.S. border in 2008. LOS EBANOS, Texas -- All along the winding Rio Grande, the people who live in this bustling, fertile region where the U.S. border meets the Gulf of Mexico never quite understood how Donald Trump's great wall could ever be much more than campaign rhetoric. Erecting a concrete barrier across the entire 1,954-mile frontier with Mexico, they know, collides head-on with multiple realities: the geology of the river valley, fierce local resistance and the immense cost. An electronically fortified "virtual wall" with surveillance technology that includes night-and-day video cameras, tethered observation balloons and high-flying drones makes a lot more sense to people here. If a 30- to 40-foot concrete wall is a panacea for illegal immigration, as Trump insisted during the campaign, the locals are not convinced.