Deep ReLU Networks Have Surprisingly Few Activation Patterns

Boris Hanin, David Rolnick

Neural Information Processing Systems 

In this paper, we attempt to capture the gap between the maximum complexity that deep ReLU networks can express and the complexity of the functions they actually learn (see Figure 1).
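One way to see this gap concretely is to count the activation patterns (the on/off states of all ReLU units) that a network actually realizes over a region of input space, compared to the exponential number that is combinatorially possible. The sketch below uses a hypothetical small random network, not the paper's experimental setup, purely to illustrate the counting idea.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy network: two hidden ReLU layers of width 8 on 2-D inputs.
W1, b1 = rng.standard_normal((8, 2)), rng.standard_normal(8)
W2, b2 = rng.standard_normal((8, 8)), rng.standard_normal(8)

def activation_pattern(x):
    """Return the on/off state of every ReLU unit for input x."""
    h1 = W1 @ x + b1
    h2 = W2 @ np.maximum(h1, 0) + b2
    return tuple((h1 > 0).astype(int)) + tuple((h2 > 0).astype(int))

# Sample many inputs and count the distinct patterns actually realized.
xs = rng.uniform(-3.0, 3.0, size=(100_000, 2))
patterns = {activation_pattern(x) for x in xs}

# With 16 ReLU units there are 2**16 = 65536 conceivable patterns,
# but only a far smaller number appear on this input region.
print(f"{len(patterns)} distinct patterns out of {2**16} possible")
```

Even this crude sampling estimate typically finds orders of magnitude fewer patterns than the combinatorial bound, which is the phenomenon the paper quantifies.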
