MEDs for PETs: Multilingual Euphemism Disambiguation for Potentially Euphemistic Terms
Patrick Lee, Alain Chirino Trujillo, Diana Cuevas Plancarte, Olumide Ebenezer Ojo, Xinyi Liu, Iyanuoluwa Shode, Yuan Zhao, Jing Peng, Anna Feldman
This study investigates the computational processing of euphemisms, a universal linguistic phenomenon, across multiple languages. We train a multilingual transformer model (XLM-RoBERTa) to disambiguate potentially euphemistic terms (PETs) in multilingual and cross-lingual settings. In line with current trends, we demonstrate that zero-shot cross-lingual transfer occurs for this task. We also show cases where multilingual models outperform monolingual models on the task by a statistically significant margin, indicating that multilingual data gives models additional opportunities to learn cross-lingual computational properties of euphemisms. In a follow-up analysis, we focus on universal euphemistic "categories" such as death and bodily functions, among others. To further understand the nature of this cross-lingual transfer, we test whether cross-lingual data of the same domain is more important than within-language data of other domains.
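As an illustration of the setup described in the abstract, below is a minimal sketch of fine-tuning XLM-RoBERTa as a binary classifier over sentences containing a PET and then evaluating it zero-shot on a sentence in another language. It uses the Hugging Face transformers and PyTorch APIs; the checkpoint ("xlm-roberta-base"), the training loop, and the example sentences and labels are illustrative assumptions, not the authors' actual data or code.

```python
# Minimal sketch (not the authors' released code): fine-tune XLM-RoBERTa to
# classify whether a potentially euphemistic term (PET) is used euphemistically
# in context, then evaluate zero-shot on another language.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "xlm-roberta-base"  # assumed checkpoint; the paper's exact variant may differ
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)

# Toy English training data containing the PET "passed away":
# 1 = euphemistic use, 0 = literal use.
train_texts = [
    "After a long illness, she finally passed away last night.",  # euphemistic
    "The ball passed away from the defender and out of bounds.",  # literal
]
train_labels = torch.tensor([1, 0])

enc = tokenizer(train_texts, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
for _ in range(3):  # a few illustrative epochs
    optimizer.zero_grad()
    out = model(**enc, labels=train_labels)
    out.loss.backward()
    optimizer.step()

# Zero-shot evaluation on a Spanish sentence never seen during fine-tuning,
# relying on XLM-RoBERTa's shared multilingual representations.
model.eval()
test = tokenizer("Mi abuelo nos dejó el año pasado.",  # "left us" = died (euphemism)
                 return_tensors="pt")
with torch.no_grad():
    pred = model(**test).logits.argmax(dim=-1).item()
print("euphemistic" if pred == 1 else "literal")
```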
arXiv.org Artificial Intelligence
Jan-25-2024
- Country:
  - Asia > Middle East > UAE (0.14)
  - North America > United States (0.28)
- Genre:
  - Research Report > New Finding (0.94)