Auxiliary-Hyperparameter-Free Sampling: Entropy Equilibrium for Text Generation
Xiaodong Cai, Hai Lin, Shaoxiong Zhan, Weiqi Luo, Hong-Gee Kim, Hongyan Hao, Yu Yang, Hai-Tao Zheng
arXiv.org Artificial Intelligence
Token sampling strategies critically influence text generation quality in large language models (LLMs). However, existing methods introduce additional hyperparameters, requiring extensive tuning and complicating deployment. We present Entropy Equilibrium Sampling (EES), an auxiliary-hyperparameter-free approach inspired by information theory that dynamically adjusts the candidate set by balancing normalized entropy with probability mass. We evaluate EES on both reasoning and generation tasks across a range of model architectures. Our results show that EES performs well across temperature settings, delivering competitive accuracy and coherence while maintaining diversity. By eliminating the need for hyperparameter tuning, EES greatly simplifies deployment while improving performance. Code is available at https://github.com/shuanncai/EES
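The abstract does not spell out the exact truncation rule, so the following is only a plausible sketch of an entropy-balanced candidate set: the normalized entropy of the next-token distribution (a value in [0, 1]) is used as a cumulative-mass cutoff, so a peaked (confident) distribution yields a small candidate set and a flat one yields a large set. The function name and the specific rule are assumptions for illustration, not the paper's algorithm.

```python
import math

def entropy_equilibrium_candidates(probs):
    """Hypothetical sketch: keep the highest-probability tokens until their
    cumulative mass reaches the distribution's normalized entropy."""
    V = len(probs)
    # Shannon entropy in nats; 0 * log(0) terms are skipped.
    H = -sum(p * math.log(p) for p in probs if p > 0.0)
    # Normalized entropy lies in [0, 1]: 0 for a one-hot
    # distribution, 1 for a uniform one.
    h_norm = H / math.log(V) if V > 1 else 0.0
    # Sort token indices by descending probability and accumulate
    # mass until it reaches the normalized-entropy threshold.
    order = sorted(range(V), key=lambda i: probs[i], reverse=True)
    kept, mass = [], 0.0
    for i in order:
        kept.append(i)
        mass += probs[i]
        if mass >= h_norm:
            break
    return kept

# Peaked distribution: low entropy, so the set collapses to the top token.
print(len(entropy_equilibrium_candidates([0.9, 0.05, 0.03, 0.02])))
# Near-uniform distribution: high entropy, so nearly all tokens are kept.
print(len(entropy_equilibrium_candidates([0.26, 0.25, 0.25, 0.24])))
```

Under this rule no auxiliary hyperparameter (top-k, top-p, etc.) is needed; the cutoff adapts per step from the distribution itself, which matches the paper's stated motivation.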
Dec-2-2025