An Analysis of Decoding Methods for LLM-based Agents for Faithful Multi-Hop Question Answering
Alexander Murphy, Mohd Sanad Zaki Rizvi, Aden Haussmann, Ping Nie, Guifu Liu, Aryo Pradipta Gema, Pasquale Minervini
arXiv.org Artificial Intelligence
Large Language Models (LLMs) frequently produce factually inaccurate outputs (a phenomenon known as hallucination), which limits their accuracy on knowledge-intensive NLP tasks. Retrieval-augmented generation and agentic frameworks such as Reasoning and Acting (ReAct) can address this issue by giving the model access to external knowledge. However, LLMs often fail to remain faithful to retrieved information. Mitigating this is critical, especially if LLMs are required to reason about the retrieved information. Recent research has explored training-free decoding strategies to improve the faithfulness of model generations. We present a systematic analysis of how combining the ReAct framework with decoding strategies (i.e., DeCoRe, DoLa, and CAD) influences the faithfulness of LLM-generated answers. Our results show that combining an agentic framework for knowledge retrieval with decoding methods that enhance faithfulness can increase accuracy on downstream multi-hop question answering tasks. For example, we observe an F1 increase from 19.5 to 32.6 on HotpotQA when using ReAct and DoLa.
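The faithfulness-oriented decoding strategies mentioned above share a common pattern: they contrast two next-token distributions and amplify the signal that comes from the trusted source. As a minimal sketch of this idea in the style of context-aware decoding (CAD), assuming hypothetical logit arrays and an illustrative weight `alpha` (not values from the paper):

```python
import numpy as np

def contrastive_scores(logits_with_context, logits_without_context, alpha=0.5):
    """CAD-style contrastive scoring sketch: boost tokens whose probability
    rises when the retrieved context is present, by contrasting next-token
    logits computed with and without that context. `alpha` is a hypothetical
    contrast weight chosen for illustration."""
    return (1 + alpha) * logits_with_context - alpha * logits_without_context

# Toy example: the retrieved context shifts mass toward token index 2,
# and the contrastive score amplifies that shift.
with_ctx = np.array([1.0, 2.0, 4.0])     # logits conditioned on context
without_ctx = np.array([1.0, 2.0, 3.0])  # logits from the bare question
scores = contrastive_scores(with_ctx, without_ctx)
next_token = int(np.argmax(scores))
```

In an agentic loop such as ReAct, a scoring rule like this would be applied at each generation step after the retrieval action, so the final answer leans on the fetched evidence rather than on parametric memory alone.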
Mar-30-2025