Antibody Foundational Model: Ab-RoBERTa
Eunna Huh, Hyeonsu Lee, Hyunjin Shin
–arXiv.org Artificial Intelligence
With the growing prominence of antibody-based therapeutics, antibody engineering has gained increasing attention as a critical area of research and development. Recent progress in transformer-based protein large language models (LLMs) has demonstrated promising applications in protein sequence design and structural prediction. Moreover, the availability of large-scale antibody datasets such as the Observed Antibody Space (OAS) database has opened new avenues for the development of LLMs specialized for processing antibody sequences. Among these, RoBERTa has demonstrated improved performance relative to BERT, while maintaining a smaller parameter count (125M) compared to the BERT-based protein model, ProtBERT (420M). This reduced model size enables more efficient deployment in antibody-related applications. However, despite the numerous advantages of the RoBERTa architecture, antibody-specific foundational models built upon it have remained inaccessible to the research community. In this study, we introduce Ab-RoBERTa, a RoBERTa-based antibody-specific LLM, which is publicly available at https://huggingface.co/mogam-ai/Ab-RoBERTa. This resource is intended to support a wide range of antibody-related research applications, including paratope prediction and humanness assessment.
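Because the checkpoint is published on the Hugging Face Hub, it can presumably be loaded through the standard `transformers` Auto classes. A minimal sketch follows; the model ID comes from the abstract, but the choice of `AutoModelForMaskedLM` (a RoBERTa-style MLM head) and the expected input format are assumptions to be checked against the released model card and tokenizer config.

```python
# Minimal sketch of loading Ab-RoBERTa from the Hugging Face Hub.
# Assumes the `transformers` library is installed; how antibody sequences
# must be formatted (per-residue spacing, special tokens) is determined by
# the released tokenizer config, not by anything stated in the abstract.

MODEL_ID = "mogam-ai/Ab-RoBERTa"  # public checkpoint named in the abstract


def load_ab_roberta():
    """Fetch the tokenizer and masked-LM model from the Hub."""
    from transformers import AutoModelForMaskedLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForMaskedLM.from_pretrained(MODEL_ID)
    return tokenizer, model
```

A typical downstream use would be `tokenizer, model = load_ab_roberta()`, then tokenizing an antibody sequence and reading either masked-token logits (e.g. for humanness-style scoring) or hidden states as embeddings for tasks such as paratope prediction.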
Jun-17-2025