Neural Networks Trained to Solve Differential Equations Learn General Representations

Neural Information Processing Systems

We introduce a technique based on the singular vector canonical correlation analysis (SVCCA) for measuring the generality of neural network layers across a continuously parametrized set of tasks. We illustrate this method by studying generality in neural networks trained to solve parametrized boundary value problems based on the Poisson partial differential equation. We find that the first hidden layers are general, and that they learn generalized coordinates over the input domain. Deeper layers are successively more specific.
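As a rough illustration of the kind of comparison SVCCA enables, the sketch below computes an SVCCA similarity score between two matrices of layer activations: each matrix is first reduced to the singular directions explaining most of its variance, and CCA is then applied to the reduced representations. This is a minimal NumPy sketch of the general SVCCA recipe, not the authors' implementation; the function name, the 99% variance threshold, and the use of the mean canonical correlation as the summary score are assumptions for illustration.

```python
import numpy as np

def svcca_similarity(X, Y, var_threshold=0.99):
    """Mean canonical correlation between SVD-reduced activations.

    X, Y: (n_samples, n_neurons) activation matrices recorded from two
    layers (or the same layer on two tasks) over the same inputs.
    """
    def reduce(A):
        A = A - A.mean(axis=0)                       # center each neuron
        U, s, _ = np.linalg.svd(A, full_matrices=False)
        # keep enough singular directions to explain var_threshold of variance
        k = np.searchsorted(np.cumsum(s**2) / np.sum(s**2), var_threshold) + 1
        return U[:, :k] * s[:k]                      # reduced representation

    Xr, Yr = reduce(X), reduce(Y)
    # CCA between the reduced subspaces: canonical correlations are the
    # singular values of Qx^T Qy for orthonormal bases Qx, Qy
    Qx, _ = np.linalg.qr(Xr)
    Qy, _ = np.linalg.qr(Yr)
    corrs = np.linalg.svd(Qx.T @ Qy, compute_uv=False)
    return float(np.mean(np.clip(corrs, 0.0, 1.0)))
```

A layer is "general" in this sense when its activations on one task score near 1 against its activations on another task from the parametrized family; identical (or invertibly linearly related) activations score 1, while unrelated activations score much lower.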







We would like to emphasize that Theorem 1 is the most important contribution of our paper due to its generality

Neural Information Processing Systems

We thank the reviewers for their insightful feedback, and we appreciate the opportunity to improve our paper. We would like to emphasize that Theorem 1 is the most important contribution of our paper due to its generality. In the Gaussian case, our sample complexity result follows directly from the expression for the optimal loss. Finally, while Dohmatob's bounds become non-trivial only when the adversarial … We will also add a clearer description of the "translate and pair in place" coupling. Comparisons with Sinha et al. are in Section 7, and we compare to Dohmatob above.