
Neural Information Processing Systems 

R2: Sparsity: Thank you for noting that the unification of core parts of the sparse neural network literature combined with the neural architecture search problem is very nice. As a useful experiment to showcase this, we apply Algorithm 1 separately to each filter of the version of ResNet50 used in [1] and obtain state-of-the-art performance at the task of training sparse networks from scratch as described in [1].

R2: Theoretical Claims: We agree that this work would benefit from theoretical analysis that extends beyond decreasing the loss on the mini-batch with the SGD update. We thank you for the excellent suggestion of offering the additional motivation of performing an approximation in the backwards pass. We are happy to see that we may arrive at the same objective in this way, and we will include this exciting result.
