RU22Fact: Optimizing Evidence for Multilingual Explainable Fact-Checking on Russia-Ukraine Conflict
Yirong Zeng, Xiao Ding, Yi Zhao, Xiangyu Li, Jie Zhang, Chao Yao, Ting Liu, Bing Qin
arXiv.org Artificial Intelligence
Fact-checking is the task of verifying the factuality of a given claim by examining the available evidence. High-quality evidence plays a vital role in enhancing fact-checking systems and facilitates the generation of explanations that are understandable to humans. However, providing evidence that is both sufficient and relevant for explainable fact-checking systems remains a challenge. To tackle this challenge, we propose a method based on a Large Language Model to automatically retrieve and summarize evidence from the Web. Furthermore, we construct RU22Fact, a novel multilingual explainable fact-checking dataset of 16K samples on the 2022 Russia-Ukraine conflict, in which each sample contains a real-world claim, optimized evidence, and a referenced explanation. To establish a baseline for our dataset, we also develop an end-to-end explainable fact-checking system that verifies claims and generates explanations. Experimental results demonstrate that optimized evidence can improve fact-checking performance and indicate room for further progress in end-to-end claim verification and explanation generation.
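The abstract describes a retrieve-summarize-verify-explain pipeline and a dataset whose samples pair a claim with optimized evidence and a referenced explanation. As a rough, hypothetical illustration (not code from the paper), the Python sketch below shows one way such a sample and pipeline could be organized; all names (FactSample, retrieve_web_evidence, summarize_evidence, verify_and_explain) and label strings are assumptions, and the stand-in functions would be replaced by a real search backend, an LLM summarizer, and a trained verifier.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical structure of one RU22Fact-style sample: a real-world claim,
# optimized (summarized) web evidence, and a referenced explanation.
@dataclass
class FactSample:
    claim: str
    language: str          # e.g. "en", "ru", "uk", "zh"
    evidence: List[str]    # claim-relevant evidence passages
    label: str             # e.g. "SUPPORTED", "REFUTED", "NOT ENOUGH INFO"
    explanation: str       # human-readable explanation of the verdict


def retrieve_web_evidence(claim: str, top_k: int = 5) -> List[str]:
    """Stand-in for a web search step that returns raw passages."""
    return [f"raw passage {i} about: {claim}" for i in range(top_k)]


def summarize_evidence(claim: str, passages: List[str]) -> List[str]:
    """Stand-in for LLM-based summarization that keeps only claim-relevant content."""
    return [p[:200] for p in passages]


def verify_and_explain(claim: str, evidence: List[str]) -> Tuple[str, str]:
    """Stand-in for an end-to-end verifier that outputs a label and an explanation."""
    return "NOT ENOUGH INFO", f"The retrieved evidence does not settle: {claim}"


def check_claim(claim: str, language: str = "en") -> FactSample:
    # Retrieve raw evidence, optimize it via summarization, then verify and explain.
    raw = retrieve_web_evidence(claim)
    evidence = summarize_evidence(claim, raw)
    label, explanation = verify_and_explain(claim, evidence)
    return FactSample(claim, language, evidence, label, explanation)


if __name__ == "__main__":
    print(check_claim("Example claim about the 2022 conflict."))
```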
Mar-26-2024