Modality-Projection Universal Model for Comprehensive Full-Body Medical Imaging Segmentation

Yixin Chen, Lin Gao, Yajuan Gao, Rui Wang, Jingge Lian, Xiangxi Meng, Yanhua Duan, Leiying Chai, Hongbin Han, Zhaoping Cheng, Zhaoheng Xie

arXiv.org Artificial Intelligence 

The integration of deep learning in medical imaging has shown great promise for enhancing diagnostic, therapeutic, and research outcomes. However, applying universal models across multiple modalities remains challenging due to the inherent variability in data characteristics. This study introduces and evaluates a Modality-Projection Universal Model (MPUM). MPUM employs a novel modality-projection strategy that allows the model to dynamically adjust its parameters to optimize performance across different imaging modalities. MPUM demonstrates superior accuracy in identifying anatomical structures, enabling precise quantification for improved clinical decision-making. It also identifies metabolic associations within the brain-body axis, advancing research on brain-body physiological correlations. Furthermore, MPUM's controller-based convolution layer enables visualization of saliency maps across all network layers, significantly enhancing the model's interpretability.
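The modality-projection idea described above (a controller network that generates convolution weights conditioned on a modality embedding, so one model adapts its parameters per modality) can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the paper's implementation: the single linear-map controller, the embedding vectors, and all names are hypothetical.

```python
import numpy as np

def controller(modality_embedding, out_ch, in_ch, k):
    # Hypothetical controller: one linear map from the modality embedding
    # to a flat vector of convolution weights (random projection for demo).
    rng = np.random.default_rng(0)
    W = rng.standard_normal((out_ch * in_ch * k * k, modality_embedding.size)) * 0.01
    return (W @ modality_embedding).reshape(out_ch, in_ch, k, k)

def conv2d(x, kernels):
    # Naive "valid" 2-D cross-correlation; x has shape (in_ch, H, W).
    out_ch, in_ch, k, _ = kernels.shape
    H, W = x.shape[1] - k + 1, x.shape[2] - k + 1
    out = np.zeros((out_ch, H, W))
    for o in range(out_ch):
        for i in range(in_ch):
            for r in range(H):
                for c in range(W):
                    out[o, r, c] += np.sum(x[i, r:r+k, c:c+k] * kernels[o, i])
    return out

# Placeholder per-modality embeddings (e.g. CT vs. MRI).
ct_emb  = np.array([1.0, 0.0])
mri_emb = np.array([0.0, 1.0])

x = np.ones((1, 8, 8))                      # toy single-channel image
y_ct  = conv2d(x, controller(ct_emb, 4, 1, 3))
y_mri = conv2d(x, controller(mri_emb, 4, 1, 3))
print(y_ct.shape)                           # (4, 6, 6)
print(np.allclose(y_ct, y_mri))             # False: kernels differ per modality
```

The design point the sketch makes is that the convolution weights are a *function of the modality*, not fixed parameters, so the same network body processes CT and MRI inputs with modality-specific filters; in the actual MPUM the controller is learned end-to-end rather than a fixed random projection.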
