Enhancing Radiographic Disease Detection with MetaCheX, a Context-Aware Multimodal Model
arXiv.org Artificial Intelligence
To bridge this gap, we introduce MetaCheX, a novel multimodal framework that integrates chest X-ray images with structured patient metadata to replicate clinical decision-making. Our approach combines a convolutional neural network (CNN) backbone with metadata processed by a multilayer perceptron (MLP), fused through a shared classifier. Evaluated on the CheXpert Plus dataset, MetaCheX consistently outperformed radiograph-only baseline models across multiple CNN architectures. Integrating metadata significantly improved overall diagnostic performance, as measured by an increase in AUROC. The results of this study demonstrate that metadata reduces algorithmic bias and enhances model generalizability across diverse patient populations.
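The fusion described above (CNN features and an MLP-encoded metadata vector joined in a shared classifier) can be sketched as a late-fusion forward pass. This is a minimal NumPy illustration, not the authors' implementation: all dimensions (512-d image features, 8 metadata fields, 32-d metadata embedding, 14 findings) and the random weights are hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, w1, b1, w2, b2):
    # One-hidden-layer perceptron with ReLU; stands in for the metadata branch.
    h = np.maximum(0.0, x @ w1 + b1)
    return h @ w2 + b2

def sigmoid(z):
    # Per-label probabilities for a multi-label chest X-ray task.
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical inputs for a batch of 2 patients:
img_feat = rng.standard_normal((2, 512))  # stand-in for pooled CNN backbone output
meta = rng.standard_normal((2, 8))        # e.g. age, sex, view position, encoded numerically

# Metadata branch (illustrative random weights).
w1 = rng.standard_normal((8, 32)) * 0.1;  b1 = np.zeros(32)
w2 = rng.standard_normal((32, 32)) * 0.1; b2 = np.zeros(32)
meta_feat = mlp(meta, w1, b1, w2, b2)

# Late fusion: concatenate both modalities, then apply a shared linear classifier.
fused = np.concatenate([img_feat, meta_feat], axis=1)
wc = rng.standard_normal((fused.shape[1], 14)) * 0.05; bc = np.zeros(14)
probs = sigmoid(fused @ wc + bc)
print(probs.shape)  # (2, 14)
```

In this style of design, the classifier sees image and metadata features jointly, so context such as patient demographics can shift the decision boundary per finding rather than being bolted on afterwards.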
Sep-17-2025
- Country:
- North America > United States > Virginia (0.04)
- Genre:
- Research Report > New Finding (1.00)
- Industry:
- Health & Medicine
- Diagnostic Medicine > Imaging (1.00)
- Nuclear Medicine (1.00)