Graph Expansion in Pruned Recurrent Neural Network Layers Preserve Performance
Suryam Arnav Kalra, Arindam Biswas, Pabitra Mitra, Biswajit Basu
–arXiv.org Artificial Intelligence
The expansion property of a graph refers to its strong connectivity combined with sparseness. It has been reported that deep neural networks can be pruned to a high degree of sparsity while maintaining their performance. Such pruning is essential for performing real-time sequence learning tasks using recurrent neural networks on resource-constrained platforms. We prune recurrent networks such as RNNs and LSTMs while maintaining a large spectral gap of the underlying graphs, ensuring their layerwise expansion properties. We also study the time-unfolded recurrent network graphs in terms of the properties of their bipartite layers. Experimental results for the benchmark sequence MNIST, CIFAR-10, and Google speech command data show that expander graph properties are key to preserving the classification accuracy of RNNs and LSTMs.

Analysis of Artificial Neural Networks (ANNs) following a connection-based approach is a topical research direction, as it not only mimics brain networks in neuroscience but can also provide specific graph measures for analyzing the performance and robustness of the networks. Researchers in recent years have explored whether there is a relation between the functional aspects of an ANN and its graph structure, and, if such a relation exists, whether there is any characterization that explains the relationship between the structure of the graph and its performance LeCun et al. (1998); Sermanet et al. (2013); Zeiler & Fergus (2014); Krizhevsky et al. (2017); Simonyan & Zisserman (2014); He et al. (2016); Szegedy et al. (2015).
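To make the abstract's idea concrete, here is a minimal NumPy sketch of the kind of pruning it describes: a layer's connectivity is fixed by a sparse 0/1 mask whose underlying bipartite graph is well connected (large spectral gap), and the dense weight matrix is pruned to that mask. The helper names (`random_regular_mask`, `spectral_gap`) and the singular-value definition of the gap are illustrative assumptions, not the authors' exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_regular_mask(n, d, rng):
    # 0/1 mask with exactly d nonzeros per row: a sparse but
    # well-connected bipartite layer, in the spirit of a random
    # expander construction (hypothetical helper).
    mask = np.zeros((n, n))
    for i in range(n):
        mask[i, rng.choice(n, size=d, replace=False)] = 1.0
    return mask

def spectral_gap(adj):
    # Gap between the two largest singular values of the bipartite
    # adjacency matrix; a large gap indicates strong expansion.
    s = np.linalg.svd(adj, compute_uv=False)
    return s[0] - s[1]

n, d = 64, 8                      # keep 8 of 64 inputs per unit (12.5% density)
mask = random_regular_mask(n, d, rng)
W = rng.standard_normal((n, n))   # stand-in for a recurrent weight matrix
W_pruned = W * mask               # prune: keep only the masked connections

print(f"density      : {mask.mean():.3f}")
print(f"spectral gap : {spectral_gap(mask):.3f}")
```

Checking the gap of the mask (rather than of the trained weights) reflects the paper's framing: expansion is a property of the layer's connectivity graph, independent of the surviving weight values.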
Mar-17-2024