Reviews: LCA: Loss Change Allocation for Neural Network Training

Neural Information Processing Systems 

There is some disagreement about this paper among the reviewers. There is a common appreciation for this line of study, and specifically for the newly proposed loss change allocation (LCA) metric. As many aspects of the training process of DNNs remain "mysterious", developing new and better "lenses" through which we can look at the inner workings of a DNN can be of great value to the field. The criticism in the less enthusiastic reviews is largely a call for "more effort": comparison to other approaches, more experiments, clarifications and improvements, and making the method more actionable. One can also give that a positive spin: there is a lot of interesting follow-up work to be done here.
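For readers unfamiliar with the metric under discussion, a minimal sketch of the core LCA idea may help: the total change in loss over training is decomposed into per-parameter contributions via a path approximation, roughly LCA_i ≈ Σ_t ∂L/∂θ_i · Δθ_i at each step. This is an illustrative toy (a quadratic loss, plain gradient descent, a midpoint-rule path estimate), not the authors' implementation:

```python
import numpy as np

def loss(theta):
    # toy quadratic loss with anisotropic curvature (illustrative only)
    return 0.5 * (2.0 * theta[0] ** 2 + 0.5 * theta[1] ** 2)

def grad(theta):
    return np.array([2.0 * theta[0], 0.5 * theta[1]])

theta0 = np.array([1.0, -2.0])
theta = theta0.copy()
lr = 0.1
lca = np.zeros_like(theta)  # per-parameter loss-change allocation

for _ in range(50):
    g = grad(theta)
    step = -lr * g
    # midpoint gradient gives a better path-integral estimate of the
    # loss change attributable to each parameter's movement this step
    g_mid = grad(theta + 0.5 * step)
    lca += g_mid * step
    theta += step

total_change = loss(theta) - loss(theta0)
# the per-parameter allocations sum to (approximately) the total loss change
print(lca, lca.sum(), total_change)
```

A negative LCA value for a parameter means its movement helped reduce the loss; summing the allocations recovers the overall loss change, which is what makes the decomposition a useful "lens" on training.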