
Neural Information Processing Systems

Additionally, we show the generalization performance of our proposed method across different visual domains. Given a problem category (task), a subset for learning can be sampled (via the domain episode module in Figure 4 in the main text). By replacing class with task, a K-shot and N-task reasoning framework can be defined. We thus show analogical learning with the existing meta-learning framework for fast adaptation from the source domain to the target domain.
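The episodic sampling described above (treating tasks like classes, then drawing K examples from each of N sampled tasks) can be sketched as follows. This is a minimal illustration, not the paper's implementation; the `domains` dictionary layout and the function name `sample_episode` are assumptions made for the example.

```python
import random

def sample_episode(domains, n_task, k_shot):
    """Sample an N-task, K-shot episode.

    By analogy with N-way K-shot classification, tasks play the role
    of classes: we first draw `n_task` tasks, then `k_shot` examples
    from each. `domains` maps a task name to its list of examples
    (a hypothetical data layout, for illustration only).
    """
    tasks = random.sample(sorted(domains), n_task)
    return {t: random.sample(domains[t], k_shot) for t in tasks}

# Toy usage: 6 tasks with 10 examples each; draw a 3-task, 2-shot episode.
domains = {f"task{i}": [f"ex{i}_{j}" for j in range(10)] for i in range(6)}
episode = sample_episode(domains, n_task=3, k_shot=2)
assert len(episode) == 3
assert all(len(examples) == 2 for examples in episode.values())
```

Sampling episodes at the domain level rather than the class level is what lets a meta-learner trained on source-domain episodes adapt quickly to a target domain.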







How Sparse Can We Prune A Deep Network: A Fundamental Limit Perspective

Neural Information Processing Systems

Network pruning is a commonly used technique to alleviate the storage and computational burden of deep neural networks. However, a characterization of the fundamental limit of network pruning is still lacking. To close this gap, in this work we take a first-principles approach: we directly impose the sparsity constraint on the loss function and leverage the framework of statistical dimension in convex geometry, which enables us to characterize the sharp phase transition point that can be regarded as the fundamental limit of the pruning ratio. Through this limit, we identify two key factors that determine the pruning ratio limit, namely, weight magnitude and network sharpness.