Waste Not, Want Not: Recycled Gumbel Noise Improves Consistency in Natural Language Generation
Damien de Mijolla, Hannan Saddiq, Kim Moore
arXiv.org Artificial Intelligence
Consistency in the output of language models is critical for their reliability and practical utility. Due to their training objective, language models learn to model the full space of possible continuations, leading to outputs that can vary significantly in style and content, even for similar or repeated inputs. To address this, we propose a novel decoding algorithm that enhances response consistency across different prompts with no degradation in response quality. By incorporating a latent variable into the next-token sampling process based on the Gumbel reparametrisation trick, our method outperforms standard sampling by up to 10% across semantic and stylistic consistency benchmarks.
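The Gumbel reparametrisation the abstract refers to is the Gumbel-max trick: adding i.i.d. Gumbel(0, 1) noise to the logits and taking the argmax yields an exact sample from the softmax distribution, which cleanly separates the randomness (the noise vector) from the model's scores. The sketch below is an illustration of the general idea, not the paper's actual algorithm: reusing the same noise vector across prompts (here `shared_noise`, a name introduced for illustration) makes two prompts with similar next-token distributions very likely to select the same token, which is the intuition behind improved consistency.

```python
import numpy as np

def gumbel_noise(rng, vocab_size):
    """Draw a Gumbel(0, 1) noise vector via the inverse CDF: g = -log(-log(u))."""
    u = rng.uniform(low=1e-12, high=1.0 - 1e-12, size=vocab_size)
    return -np.log(-np.log(u))

def gumbel_max_sample(logits, noise):
    """Gumbel-max trick: argmax(logits + g) is an exact sample from softmax(logits)."""
    return int(np.argmax(logits + noise))

# Hypothetical illustration of noise recycling: fix one latent noise vector
# and reuse it when decoding from two closely related prompts.
rng = np.random.default_rng(0)
vocab_size = 50_000
shared_noise = gumbel_noise(rng, vocab_size)

logits_a = rng.normal(size=vocab_size)                      # prompt A's next-token logits
logits_b = logits_a + 0.01 * rng.normal(size=vocab_size)    # a near-identical prompt B

tok_a = gumbel_max_sample(logits_a, shared_noise)
tok_b = gumbel_max_sample(logits_b, shared_noise)
# With shared noise, tok_a == tok_b with high probability; with independent
# noise draws for each prompt, agreement would be no more likely than under
# ordinary i.i.d. sampling.
```

Because the argmax over `logits + noise` is deterministic given the noise, recycling the noise trades no sampling quality for consistency: each individual output is still an exact draw from the model's distribution.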
Mar-2-2025