Multi-Label Relation Extraction Using Transformer Layers
Ngoc Luyen Le, Gildas Tagny Ngompé
–arXiv.org Artificial Intelligence
In this article, we present the BTransformer18 model, a deep learning architecture designed for multi-label relation extraction in French texts. Our approach combines the contextual representation capabilities of pre-trained language models from the BERT family (such as BERT, RoBERTa, and their French counterparts CamemBERT and FlauBERT) with the power of Transformer encoders to capture long-term dependencies between tokens. Experiments conducted on the dataset from the TextMine'25 challenge show that our model achieves superior performance, particularly when using CamemBERT-Large, with a macro F1 score of 0.654, surpassing the results obtained with FlauBERT-Large. These results demonstrate the effectiveness of our approach for the automatic extraction of complex relations in intelligence reports.
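The pipeline the abstract describes (contextual embeddings from a BERT-family model, a Transformer encoder on top, and a multi-label output head) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the `nn.Embedding` layer stands in for CamemBERT/FlauBERT contextual embeddings, and all dimensions, including the assumption of 18 relation labels (guessed from the name "BTransformer18"), are hypothetical.

```python
import torch
import torch.nn as nn

class MultiLabelRelationExtractor(nn.Module):
    """Sketch of a BTransformer18-style architecture: token embeddings
    (a plain nn.Embedding here; the paper uses CamemBERT/FlauBERT)
    feed a Transformer encoder, whose pooled output drives a
    multi-label sigmoid head producing one probability per relation."""

    def __init__(self, vocab_size=1000, d_model=64, n_heads=4,
                 n_layers=2, n_relations=18):  # 18 labels: assumption
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, n_relations)

    def forward(self, token_ids):
        h = self.encoder(self.embed(token_ids))  # long-range dependencies
        pooled = h.mean(dim=1)                   # simple mean pooling
        return torch.sigmoid(self.head(pooled))  # independent label probs

model = MultiLabelRelationExtractor()
probs = model(torch.randint(0, 1000, (2, 16)))  # 2 sentences, 16 tokens each
print(probs.shape)  # one probability per relation label per sentence
```

The sigmoid head (rather than a softmax) is what makes the task multi-label: each relation probability is predicted independently, so a sentence can express several relations at once.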
Feb-21-2025