A foolproof way to shrink deep learning models


As more artificial intelligence applications move to smartphones, deep learning models are getting smaller to allow apps to run faster and save battery power. Now, MIT researchers have a new and better way to compress models. It's so simple that they unveiled it in a tweet last month: Train the model, prune its weakest connections, retrain the model at its fast, early training rate, and repeat, until the model is as tiny as you want. "That's it," says Alex Renda, a PhD student at MIT. "The standard things people do to prune their models are crazy complicated." Renda discussed the technique when the International Conference on Learning Representations (ICLR) convened remotely this month.
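For readers curious what that loop looks like in code, here is a minimal sketch of the recipe as described above: iterative magnitude pruning, with each retraining round rewound to the original fast, early learning-rate schedule. It assumes PyTorch; the model, synthetic data, pruning fraction, and learning-rate schedule are illustrative placeholders, not values from the MIT work.

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune
from torch.utils.data import DataLoader, TensorDataset

def train(model, loader, lr_schedule):
    """Train the model for len(lr_schedule) epochs, one learning rate per epoch."""
    optimizer = torch.optim.SGD(model.parameters(), lr=lr_schedule[0], momentum=0.9)
    loss_fn = nn.CrossEntropyLoss()
    for lr in lr_schedule:
        for group in optimizer.param_groups:
            group["lr"] = lr
        for x, y in loader:
            optimizer.zero_grad()
            loss_fn(model(x), y).backward()
            optimizer.step()

def prune_smallest(model, fraction):
    """Globally zero out the given fraction of the remaining smallest-magnitude weights."""
    params = [(m, "weight") for m in model.modules()
              if isinstance(m, (nn.Linear, nn.Conv2d))]
    prune.global_unstructured(params, pruning_method=prune.L1Unstructured, amount=fraction)

# Tiny synthetic classification problem so the script runs end to end (placeholder data).
X, y = torch.randn(512, 784), torch.randint(0, 10, (512,))
loader = DataLoader(TensorDataset(X, y), batch_size=64, shuffle=True)

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))

# Hypothetical learning-rate schedule: fast at first, decayed later.
lr_schedule = [0.1] * 3 + [0.01] * 2 + [0.001]

train(model, loader, lr_schedule)            # 1. train the model
for _ in range(5):                           # repeat until the model is as small as you want
    prune_smallest(model, fraction=0.2)      # 2. prune the weakest (smallest-magnitude) connections
    train(model, loader, lr_schedule)        # 3. retrain, rewinding to the original
                                             #    fast, early learning-rate schedule
```

The detail that distinguishes this recipe from standard fine-tuning is the last step: each retraining round restarts the learning-rate schedule from its large, early values rather than continuing at the small rate used at the end of training.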
