Look-back Decoding for Open-Ended Text Generation
Xu, Nan, Zhou, Chunting, Celikyilmaz, Asli, Ma, Xuezhe
–arXiv.org Artificial Intelligence
In this work, we propose Look-back, an improved decoding algorithm that leverages the Kullback-Leibler divergence to track the distribution distance between current and historical decoding steps. Look-back can thus automatically predict potential repetitive phrases and topic drift, and remove tokens that may cause these failure modes, restricting the next-token probability distribution within a plausible distance to the history. We perform decoding experiments on document continuation.

Figure 1: Maximum similarity of hidden states and normalized minimum KL divergence between the current step and history (a) or prefix (b), from GPT2 on 1,000 instances of WikiText-103. Compared with human continuation, (a) repetition has much smaller minKL but indistinguishably high maxHidden with the history text; (b) pseudo topic drift, induced by switching to the continuation of another instance, has much higher minKL but similarly high maxHidden with the prefix text.
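The core signal described above — the minimum KL divergence between the current next-token distribution and the distributions at earlier decoding steps — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation; the function names and the threshold value are assumptions for demonstration.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two discrete probability distributions."""
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return float(np.sum(p * np.log(p / q)))

def min_kl_to_history(current_probs, history_probs):
    """Smallest KL divergence from the current next-token distribution
    to any historical step's distribution; a small value suggests the
    model may be entering a repetition loop."""
    return min(kl_divergence(current_probs, h) for h in history_probs)

def is_repetitive(current_probs, history_probs, threshold=0.1):
    """Flag potential repetition when the current distribution lies
    within `threshold` KL of some earlier step (threshold is illustrative,
    not a value from the paper)."""
    return min_kl_to_history(current_probs, history_probs) < threshold
```

At decoding time, one would call `is_repetitive` with the softmax output of the current step and the stored outputs of previous steps, and restrict or resample the candidate tokens when the flag fires.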
Oct-22-2023