Estimating Conditional Mutual Information for Dynamic Feature Selection
Soham Gadgil, Ian Covert, Su-In Lee
Dynamic feature selection, where we sequentially query features to make accurate predictions with a minimal budget, is a promising paradigm to reduce feature acquisition costs and provide transparency into a model's predictions. The problem is challenging, however, as it requires both predicting with arbitrary feature sets and learning a policy to identify valuable selections. Here, we take an information-theoretic perspective and prioritize features based on their mutual information with the response variable. The main challenge is implementing this policy, and we design a new approach that estimates the mutual information in a discriminative rather than generative fashion. Building on our approach, we then introduce several further improvements: allowing variable feature budgets across samples, enabling non-uniform feature costs, incorporating prior information, and exploring modern architectures to handle partial inputs. Our experiments show that our method provides consistent gains over recent methods across a variety of datasets.
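As a rough illustration of the greedy, information-theoretic policy described in the abstract, the sketch below selects features one at a time by scoring each unselected candidate with a learned value network meant to approximate the conditional mutual information I(y; x_i | x_S) given the currently observed set S. This is a minimal sketch under stated assumptions, not the authors' implementation: the network names (`f`, `v`), the mask-concatenation input encoding, and the toy dimensions are illustrative choices.

```python
# Minimal sketch of greedy, CMI-guided dynamic feature selection.
# Assumptions (not from the paper text): a predictor `f` and a value
# network `v` that both take the observed features concatenated with a
# binary mask; `v` outputs one score per feature, interpreted as an
# estimate of I(y; x_i | x_S) for the current selection S.
import torch
import torch.nn as nn

d, num_classes = 20, 2  # toy sizes for illustration

def masked_input(x, mask):
    # Zero out unobserved features and append the mask so the networks
    # can tell "missing" apart from "observed and equal to zero".
    return torch.cat([x * mask, mask], dim=-1)

f = nn.Sequential(nn.Linear(2 * d, 64), nn.ReLU(), nn.Linear(64, num_classes))
v = nn.Sequential(nn.Linear(2 * d, 64), nn.ReLU(), nn.Linear(64, d))

@torch.no_grad()
def select_greedy(x, budget, costs=None):
    """Reveal features one at a time, taking the candidate with the
    highest estimated CMI (optionally divided by its acquisition cost)."""
    mask = torch.zeros_like(x)
    for _ in range(budget):
        scores = v(masked_input(x, mask))
        if costs is not None:
            scores = scores / costs          # non-uniform feature costs
        scores[mask.bool()] = -float("inf")  # exclude already-selected features
        mask[scores.argmax()] = 1.0
    probs = torch.softmax(f(masked_input(x, mask)), dim=-1)
    return mask, probs

x = torch.randn(d)
mask, probs = select_greedy(x, budget=5)
```

Roughly, the discriminative estimate comes from training `v` to predict, for each candidate feature, how much revealing that feature would reduce the predictor's loss on the current example; no generative model of the unobserved features is required. Variable per-sample budgets can then be obtained by stopping the selection loop once the largest remaining score falls below a threshold.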
arXiv.org Artificial Intelligence
Oct-6-2023