Comparative Approaches to Sentiment Analysis Using Datasets in Major European and Arabic Languages

Krasitskii, Mikhail, Kolesnikova, Olga, Hernandez, Liliana Chanona, Sidorov, Grigori, Gelbukh, Alexander

arXiv.org Artificial Intelligence 

This study evaluates transformer-based models, including BERT, mBERT, and XLM-R, for multilingual sentiment analysis across languages with diverse linguistic structures. A key finding is XLM-R's superior adaptability to morphologically complex languages, where it achieves accuracy above 88%. The work also examines fine-tuning strategies and their importance for improving sentiment classification in underrepresented languages.
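The fine-tuning setup described above can be illustrated with a minimal sketch. Note the assumptions: the paper fine-tunes full pretrained encoders (e.g., XLM-R) on labeled sentiment data, whereas this toy example trains only a small classification head on random placeholder vectors standing in for encoder sentence embeddings, so it runs without model downloads; the 768-dimensional size merely matches XLM-R base's hidden size, and all hyperparameters are illustrative.

```python
import torch
import torch.nn as nn

# Placeholder "encoder outputs": in the actual setting these would be
# sentence embeddings from a pretrained multilingual encoder such as XLM-R.
torch.manual_seed(0)
EMBED_DIM, NUM_CLASSES, N = 768, 2, 64  # 768 = XLM-R base hidden size

features = torch.randn(N, EMBED_DIM)          # stand-in sentence embeddings
labels = torch.randint(0, NUM_CLASSES, (N,))  # toy binary sentiment labels

# Lightweight classification head of the kind placed on top of an encoder
# when fine-tuning for sentence-level sentiment classification.
head = nn.Sequential(nn.Dropout(0.1), nn.Linear(EMBED_DIM, NUM_CLASSES))
optimizer = torch.optim.AdamW(head.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

head.train()
for epoch in range(20):  # short training loop on the toy data
    optimizer.zero_grad()
    loss = loss_fn(head(features), labels)
    loss.backward()
    optimizer.step()

head.eval()
with torch.no_grad():
    preds = head(features).argmax(dim=-1)
accuracy = (preds == labels).float().mean().item()
print(f"toy training accuracy: {accuracy:.2f}")
```

In the paper's full setting, the encoder parameters would also be updated (or selectively frozen), which is where language-specific fine-tuning strategies come into play.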