Low-Resource Neural Machine Translation Using Recurrent Neural Networks and Transfer Learning: A Case Study on English-to-Igbo
Ekle, Ocheme Anthony, Das, Biswarup
arXiv.org Artificial Intelligence
In this study, we develop Neural Machine Translation (NMT) and Transformer-based transfer learning models for English-to-Igbo translation. Igbo is a low-resource African language spoken by over 40 million people across Nigeria and West Africa. Our models are trained on a curated and benchmarked dataset compiled from Bible corpora, local news, Wikipedia articles, and Common Crawl, all verified by native language experts. We leverage Recurrent Neural Network (RNN) architectures, including Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) models, enhanced with attention mechanisms to improve translation accuracy. To further enhance performance, we apply transfer learning using MarianNMT pre-trained models within the SimpleTransformers framework. Our RNN-based system achieves competitive results, closely matching existing English-Igbo benchmarks. With transfer learning, we observe a performance gain of +4.83 BLEU points, reaching an estimated translation accuracy of 70%. These findings highlight the effectiveness of combining RNNs with transfer learning to close the performance gap in low-resource language translation tasks.
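The attention mechanism the abstract describes for its LSTM/GRU encoder-decoder can be sketched in a few lines. This is a minimal, hypothetical illustration of dot-product (Luong-style) attention over a sequence of encoder hidden states, not the authors' actual implementation; the shapes and random inputs are assumptions for demonstration only.

```python
import numpy as np

def dot_product_attention(decoder_state, encoder_states):
    """Compute a context vector as an attention-weighted sum of encoder states.

    decoder_state:  (d,)   current decoder hidden state
    encoder_states: (T, d) one hidden state per source token
    """
    scores = encoder_states @ decoder_state        # (T,) alignment scores
    weights = np.exp(scores - scores.max())        # numerically stable softmax
    weights /= weights.sum()                       # distribution over source tokens
    context = weights @ encoder_states             # (d,) weighted sum of encoder states
    return context, weights

# Toy example: 5 source tokens, hidden size 8
rng = np.random.default_rng(0)
enc = rng.standard_normal((5, 8))
dec = rng.standard_normal(8)
context, weights = dot_product_attention(dec, enc)
print(weights)  # attention distribution over the 5 source positions
```

At each decoding step, the context vector is concatenated with the decoder state before predicting the next target token, letting the model focus on the most relevant source words.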
Apr-25-2025