Goto

layer neural network

Neural Information Processing Systems

The problem of learning the parameters of a neural network is two-fold. First, we want the training on a set of data, via minimization of a suitable loss function, to succeed in finding a set of parameters for which the value of the loss is close to its global minimum.
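As an illustration of this first part of the problem, the following minimal sketch (not taken from the paper; the data, layer sizes, learning rate, and tanh activation are all illustrative assumptions) trains a one-hidden-layer network by plain gradient descent on a squared loss and reports the final training loss.

```python
# Minimal sketch: gradient descent on the empirical squared loss of a
# one-hidden-layer tanh network. All sizes and data are illustrative.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                  # toy inputs
y = np.tanh(X @ rng.normal(size=(5, 1)))       # toy targets

W1 = rng.normal(scale=0.5, size=(5, 16))       # hidden-layer weights
w2 = rng.normal(scale=0.5, size=(16, 1))       # output weights

def loss(W1, w2):
    h = np.tanh(X @ W1)
    return 0.5 * np.mean((h @ w2 - y) ** 2)

lr = 0.1
for step in range(2000):
    h = np.tanh(X @ W1)                            # hidden activations
    err = (h @ w2 - y) / len(X)                    # averaged residual
    grad_w2 = h.T @ err                            # gradient w.r.t. output weights
    grad_W1 = X.T @ ((err @ w2.T) * (1 - h ** 2))  # backprop through tanh
    W1 -= lr * grad_W1
    w2 -= lr * grad_w2

print(f"final training loss: {loss(W1, w2):.4f}")
```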




Response for paper ID 8808

Neural Information Processing Systems

We thank all reviewers for their thoughtful feedback. Please find detailed responses to your comments below. Thank you very much for carefully reading our paper and your supportive comments. Theorem 2 is a lot to unpack.


Supplementary Material Training for the Future: A Simple Gradient Interpolation Loss to Generalize Along Time

Neural Information Processing Systems

In the main text, many algorithmic details were omitted or only discussed briefly. A.1 Dataset Details: In this section we expand on the seven datasets used in our experiments. The task is multi-class classification with a heavy class imbalance. The dataset has 8 features, including price, day of the week, and units transferred. We discard instances with missing values.
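The preprocessing described here can be sketched as follows; the file name and column names (price, day_of_week, units_transferred, label) are hypothetical placeholders rather than the authors' actual identifiers.

```python
# Hedged sketch of the preprocessing described above; all names are hypothetical.
import pandas as pd

df = pd.read_csv("sales.csv")        # hypothetical raw data file

df = df.dropna()                     # discard instances with missing values

feature_cols = [c for c in df.columns if c != "label"]
assert len(feature_cols) == 8        # 8 features, per the text above

X = df[feature_cols]
y = df["label"]                      # multi-class target

# Inspect the heavy class imbalance mentioned in the text.
print(y.value_counts(normalize=True))
```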


Uncovering Critical Sets of Deep Neural Networks via Sample-Independent Critical Lifting

Zhang, Leyang, Zhang, Yaoyu, Luo, Tao

arXiv.org Artificial Intelligence

This paper investigates the sample dependence of critical points for neural networks. We introduce a sample-independent critical lifting operator that associates a parameter of one network with a set of parameters of another, thus defining sample-dependent and sample-independent lifted critical points. We then show by example that previously studied critical embeddings do not capture all sample-independent lifted critical points. Finally, we demonstrate the existence of sample-dependent lifted critical points for sufficiently large sample sizes and prove that saddles appear among them.
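One concrete, well-known example of lifting a parameter of a narrower network to parameters of a wider one is the neuron-splitting construction. The sketch below illustrates it for a one-hidden-layer network; it is an assumed illustration of that construction only, not the paper's lifting operator, and all sizes are arbitrary.

```python
# Neuron-splitting sketch for f(x) = a^T tanh(W x): one hidden neuron is
# duplicated and its output weight is split, so the wider network computes
# exactly the same function on every input.
import numpy as np

def forward(W, a, X):
    """Output of the two-layer network on a batch X (rows are inputs)."""
    return np.tanh(X @ W.T) @ a

def split_neuron(W, a, k, t=0.5):
    """Lift (W, a) with m hidden neurons to m + 1 neurons by duplicating
    neuron k and splitting its output weight into t*a_k and (1-t)*a_k."""
    W_lift = np.vstack([W, W[k:k+1]])
    a_lift = np.concatenate([a, [(1 - t) * a[k]]])
    a_lift[k] = t * a[k]
    return W_lift, a_lift

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 4))     # 3 hidden neurons, 4 inputs
a = rng.normal(size=3)
X = rng.normal(size=(10, 4))

W_lift, a_lift = split_neuron(W, a, k=1)
print(np.allclose(forward(W, a, X), forward(W_lift, a_lift, X)))  # True
```

The final check confirms that the lifted parameters compute the same function as the original network, which is the sense in which such constructions carry parameters (and critical points) of a small network into a larger one.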