KL-Divergence Guided Temperature Sampling
Chung-Ching Chang, David Reitter, Renat Aksitov, Yun-Hsuan Sung
arXiv.org Artificial Intelligence
Temperature sampling is a conventional approach to diversifying large language model predictions. As the temperature increases, predictions become more diverse but also more vulnerable to hallucinations: generating tokens that are sensible but not factual. A common approach to mitigating hallucinations is to provide source/grounding documents and train the model to produce predictions that bind to, and are attributable to, the provided source. This suggests a trade-off between diversity and attribution. To mitigate any such trade-off, we propose relaxing the constraint of a fixed temperature across decoding steps, together with a mechanism that guides the dynamic temperature at each step according to that step's relevance to the source, measured via KL-divergence. Our experiments justify the trade-off, and show that our sampling algorithm outperforms the conventional top-k and top-p algorithms on conversational question-answering and summarization tasks.
Nov-29-2023
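The abstract describes the mechanism only at a high level. Below is a minimal Python sketch of one way a KL-guided dynamic temperature could work at each decoding step, assuming the model can produce next-token logits both with and without the grounding document in context. The function and parameter names (`dynamic_temperature_step`, `t_base`, `sensitivity`, `t_min`) and the exponential KL-to-temperature mapping are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a 1-D logit vector."""
    z = logits / temperature
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two discrete token distributions."""
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return float(np.sum(p * np.log(p / q)))

def dynamic_temperature_step(logits_with_source, logits_without_source,
                             t_base=1.0, sensitivity=1.0, t_min=0.1):
    """One decoding step with a KL-guided temperature (illustrative sketch).

    The KL divergence between the source-conditioned and unconditioned
    next-token distributions serves as a proxy for how strongly the
    current step depends on the grounding document: a large divergence
    means the source matters here, so the temperature is lowered to keep
    the prediction attributable; a small divergence means the step is
    source-agnostic and can be sampled at a higher temperature for
    diversity. The exponential-decay mapping below is an assumption.
    """
    p_cond = softmax(logits_with_source)
    p_free = softmax(logits_without_source)
    kl = kl_divergence(p_cond, p_free)

    temperature = max(t_min, t_base * np.exp(-sensitivity * kl))
    probs = softmax(logits_with_source, temperature)
    probs = probs / probs.sum()  # guard against floating-point drift
    token = int(np.random.choice(len(probs), p=probs))
    return token, temperature, kl

# Toy usage with random logits standing in for real model outputs.
rng = np.random.default_rng(0)
logits_src = rng.normal(size=32000)   # source-conditioned logits
logits_free = rng.normal(size=32000)  # unconditioned logits
token, temp, kl = dynamic_temperature_step(logits_src, logits_free)
```

Note that with `sensitivity=0` the temperature stays at `t_base` for every step, so conventional fixed-temperature sampling falls out as a special case of the dynamic scheme.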