Enhancing Biomedical Text Summarization and Question-Answering: On the Utility of Domain-Specific Pre-Training
Dima Galat, Marian-Andrei Rizoiu
arXiv.org Artificial Intelligence
Biomedical summarization requires large datasets for training text-generation models. We show that while transfer learning offers a viable option for addressing this challenge, in-domain pre-training does not always confer an advantage on a BioASQ summarization task. We identify a suitable model architecture and use it to demonstrate the benefit of general-domain pre-training followed by task-specific fine-tuning on the BioASQ summarization task, leading to a novel three-step fine-tuning approach that works with only a thousand in-domain examples. Our results indicate that a large language model without domain-specific pre-training can have a significant edge on some domain-specific biomedical text-generation tasks.
Jul-10-2023
- Country:
- Europe > Greece
- Central Macedonia > Thessaloniki (0.04)
- Oceania > Australia (0.04)
- Genre:
- Research Report > New Finding (1.00)
- Industry:
- Health & Medicine (1.00)