Knowledge Distillation Performs Partial Variance Reduction

Neural Information Processing Systems 

Knowledge distillation is a popular approach for enhancing the performance of "student" models, with lower representational capacity, by taking advantage of more powerful "teacher" models.
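The teacher-student setup described above is typically trained with the standard distillation objective (a convex combination of hard-label cross-entropy and a temperature-softened KL term between teacher and student outputs). A minimal NumPy sketch of that loss follows; the function names, the temperature `T`, and the mixing weight `alpha` are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; higher T yields softer distributions.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Classic knowledge-distillation objective: mix cross-entropy on the
    # true labels with KL divergence between the teacher's and student's
    # temperature-softened predictive distributions.
    student_logits = np.asarray(student_logits, dtype=float)
    teacher_logits = np.asarray(teacher_logits, dtype=float)
    labels = np.asarray(labels)
    p_student = softmax(student_logits)
    ce = -np.log(p_student[np.arange(len(labels)), labels]).mean()
    ps_T = softmax(student_logits, T)
    pt_T = softmax(teacher_logits, T)
    kl = (pt_T * (np.log(pt_T) - np.log(ps_T))).sum(axis=-1).mean()
    # The T**2 factor keeps the soft term's gradient magnitude comparable
    # to the hard term as the temperature changes.
    return alpha * ce + (1 - alpha) * (T ** 2) * kl
```

When the student matches the teacher exactly, the KL term vanishes and only the cross-entropy contribution remains, which is a useful sanity check on any implementation.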
