On Infinite-Width Hypernetworks

Neural Information Processing Systems 

We show that, unlike standard architectures, infinitely wide hypernetworks are not guaranteed to converge to a global minimum under gradient descent.
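For context, a hypernetwork is a network that emits the weights of a second (target) network, which is then applied to the actual input. The sketch below is only an illustration of that setup, with dimensions and layer choices chosen here for clarity, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not from the paper)
d_in, d_hidden, d_out = 3, 8, 2   # target network f
d_z, d_h = 4, 16                  # hypernetwork g

# Hypernetwork g: maps a task code z to the flat weight vector of f.
n_weights = d_in * d_hidden + d_hidden * d_out
W1 = rng.standard_normal((d_h, d_z)) / np.sqrt(d_z)
W2 = rng.standard_normal((n_weights, d_h)) / np.sqrt(d_h)

def hypernet(z):
    """Produce the target network's weights from the code z."""
    h = np.maximum(0.0, W1 @ z)   # ReLU hidden layer
    return W2 @ h                 # flat weight vector for f

def target_net(x, w_flat):
    """Evaluate f(x) using weights emitted by the hypernetwork."""
    A = w_flat[: d_in * d_hidden].reshape(d_hidden, d_in)
    B = w_flat[d_in * d_hidden :].reshape(d_out, d_hidden)
    return B @ np.maximum(0.0, A @ x)

z = rng.standard_normal(d_z)      # task code fed to the hypernetwork
x = rng.standard_normal(d_in)     # input fed to the target network
y = target_net(x, hypernet(z))    # shape (d_out,)
```

The paper's infinite-width analysis concerns taking the widths of such a generator network to infinity; the composition of the two networks is what distinguishes this limit from the standard wide-network setting.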
