Review for NeurIPS paper: Consistent feature selection for analytic deep neural networks
This paper shows that the adaptive group lasso feature selection method, more specifically a combined two-stage strategy called GL+AGL, is selection consistent for a very general class of deep neural networks, provided the DNN interacts with the input through a finite set of linear units. This is an important property since it guarantees that, with enough training examples, GL+AGL will correctly identify the set of relevant inputs, making the DNN more interpretable. The general structure of the proof follows the analysis of high-dimensional linear models, but new technical elements are introduced to tackle the difficulties that arise when the linear transformation of the first layer is followed by a sequence of non-linear transformations typically used in DNNs. Finally, the numerical experiments provide evidence that the popular group lasso method might be an inefficient feature selection method for DNNs.
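To make the two-stage GL+AGL idea concrete, here is a minimal sketch (not the authors' implementation): a group lasso penalty is placed on the columns of the first-layer weight matrix, one group per input feature, and a second fit reuses the stage-one group norms as adaptive weights. The tiny network, the regularization strengths, the exponent gamma, and the selection threshold are all illustrative assumptions, not values from the paper.

```python
import torch
import torch.nn as nn

def group_norms(first_layer: nn.Linear) -> torch.Tensor:
    # One group per input feature: the j-th column of the weight matrix.
    return first_layer.weight.norm(dim=0)

def fit(model, X, y, lam, adaptive_weights=None, epochs=200, lr=1e-2):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    mse = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        penalty = group_norms(model[0])
        if adaptive_weights is not None:
            penalty = penalty * adaptive_weights   # adaptive group lasso
        loss = mse(model(X), y) + lam * penalty.sum()
        loss.backward()
        opt.step()
    return model

torch.manual_seed(0)
n, p = 200, 10
X = torch.randn(n, p)
y = torch.sin(X[:, 0:1]) + 0.5 * X[:, 1:2]          # only features 0 and 1 matter

# Stage 1: plain group lasso (GL) to obtain preliminary group norms.
model = nn.Sequential(nn.Linear(p, 16), nn.Tanh(), nn.Linear(16, 1))
fit(model, X, y, lam=1e-2)
with torch.no_grad():
    w = group_norms(model[0])
    adaptive = 1.0 / (w.clamp(min=1e-8) ** 2)        # gamma = 2, an assumed choice

# Stage 2: adaptive group lasso (AGL) with the data-dependent weights.
model2 = nn.Sequential(nn.Linear(p, 16), nn.Tanh(), nn.Linear(16, 1))
fit(model2, X, y, lam=1e-2, adaptive_weights=adaptive)

selected = (group_norms(model2[0]) > 1e-3).nonzero().flatten()
print("selected features:", selected.tolist())
```

Note that plain gradient steps do not drive whole groups exactly to zero, so the sketch selects features by thresholding the final group norms; a proximal-gradient update would be the more standard way to obtain exact zeros.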