Reviews: Toward Deeper Understanding of Neural Networks: The Power of Initialization and a Dual View on Expressivity

Neural Information Processing Systems 

In fact, there is almost no discussion of the implications of the paper's results. The notions of "computational skeleton" and "realisation of a skeleton" are relatively interesting and aim to generalise the kernel constructions proposed in [13] and [29]. This construction is, in my view, the main contribution of the paper. From a theoretical point of view, and to my understanding, the main results of the paper are Theorems 3 and 4. However, these results are not so surprising since, as the authors confirm, they can be interpreted as a kind of law of large numbers: the randomness of the network's activations is "counterbalanced" by the replication of the skeleton.
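To make the "law of large numbers" reading concrete, here is a minimal sketch (my own illustration, not code from the paper): for a single ReLU layer with Gaussian weights, the empirical inner product of random features concentrates, as the number of replicated units grows, around a closed-form dual kernel (the degree-1 arc-cosine kernel). The function names and the specific test vectors are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5
x = rng.standard_normal(d)
y = rng.standard_normal(d)

def relu(z):
    return np.maximum(z, 0.0)

def arccos1_kernel(x, y):
    # Closed-form expectation E_w[relu(w.x) * relu(w.y)] for w ~ N(0, I_d),
    # i.e. the dual (arc-cosine, degree 1) kernel of the ReLU activation.
    nx, ny = np.linalg.norm(x), np.linalg.norm(y)
    cos_t = np.clip(x @ y / (nx * ny), -1.0, 1.0)
    theta = np.arccos(cos_t)
    return (nx * ny / (2 * np.pi)) * (np.sin(theta) + (np.pi - theta) * cos_t)

def empirical_kernel(x, y, m, rng):
    # m random hidden units play the role of replicating the skeleton:
    # the average over units is a Monte Carlo estimate of the dual kernel.
    W = rng.standard_normal((m, x.size))
    return (relu(W @ x) @ relu(W @ y)) / m

exact = arccos1_kernel(x, y)
for m in [100, 10_000, 1_000_000]:
    print(m, abs(empirical_kernel(x, y, m, rng) - exact))
```

Running this shows the approximation error shrinking as m grows, which is exactly the sense in which the random activations are averaged out by replication.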