Rethinking Transfer Learning for Medical Image Classification
Le Peng, Hengyue Liang, Gaoxiang Luo, Taihui Li, Ju Sun
arXiv.org Artificial Intelligence
Transfer learning (TL) from pretrained deep models is standard practice in modern medical image classification (MIC). However, which levels of features to reuse is problem-dependent, and uniformly finetuning all layers of a pretrained model may be suboptimal. This insight has partly motivated recent differential TL strategies, such as TransFusion (TF) and layer-wise finetuning (LWFT), which treat the layers of a pretrained model differentially. In this paper, we add one more strategy to this family, called TruncatedTL, which reuses and finetunes appropriate bottom layers and directly discards the remaining layers. Compared with other differential TL methods, this yields not only superior MIC performance but also compact models for efficient inference. Our code is available at: https://github.com/sun-umn/TTL
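The core idea in the abstract — keep and finetune only the bottom layers of a pretrained model, drop the top layers, and attach a new classification head — can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the pretrained model is modeled as an ordered list of plain callables, and names like `truncated_model` are hypothetical.

```python
# Minimal sketch of the TruncatedTL idea, assuming a pretrained model can be
# viewed as an ordered stack of feature-extraction layers (toy callables here).

def make_pretrained_layers():
    # Stand-ins for pretrained layers: each maps a feature vector to a new one.
    return [
        lambda x: [v * 2 for v in x],   # bottom layer: low-level features
        lambda x: [v + 1 for v in x],   # middle layer
        lambda x: [v ** 2 for v in x],  # top layer: most task-specific features
    ]

def truncated_model(layers, keep, head):
    """Keep the bottom `keep` layers, discard the rest, attach a new head."""
    kept = layers[:keep]  # layers above `keep` are dropped entirely
    def forward(x):
        for layer in kept:
            x = layer(x)
        return head(x)    # new classification head, trained from scratch
    return forward

# Example: keep 2 of 3 pretrained layers; the head averages features as a toy classifier.
layers = make_pretrained_layers()
model = truncated_model(layers, keep=2, head=lambda x: sum(x) / len(x))
print(model([1.0, 2.0, 3.0]))  # [1,2,3] -> *2 -> [2,4,6] -> +1 -> [3,5,7] -> mean 5.0
```

In a real PyTorch workflow, the same truncation would amount to slicing the children of a pretrained backbone (e.g. keeping the early blocks of a ResNet) and appending a fresh classifier, which also shrinks the model for faster inference.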
Dec-16-2023