Look-back Decoding for Open-Ended Text Generation

Nan Xu, Chunting Zhou, Asli Celikyilmaz, Xuezhe Ma

arXiv.org Artificial Intelligence 

Look-back is an improved decoding algorithm that leverages the Kullback-Leibler divergence to track the distribution distance between current and historical decoding steps. Thus, Look-back can automatically predict potential repetitive phrase and topic drift, and remove tokens that may cause these failure modes, restricting the next-token probability distribution within a plausible distance to the history. We perform decoding experiments on document continuation and story generation, and demonstrate that Look-back is able to generate more fluent and coherent text, outperforming other strong decoding methods in both automatic and human evaluations.

Figure 1: Maximum similarity of hidden states and normalized minimum KL divergence between the current step and history (a) or prefix (b), from GPT2 on 1,000 instances of WikiText-103. Compared with human continuation, (a): repetition has much smaller minKL but indistinguishably high maxHidden with history text; (b): pseudo topic drift, produced by switching to the continuation of another instance, has much higher minKL but similarly high maxHidden with prefix text.
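The repetition signal described above is straightforward to prototype. Below is a minimal sketch in NumPy of just that check: it records the next-token distribution at each decoding step and flags a new step whose minimum KL divergence to any recorded step falls below a threshold. The function names (kl_divergence, lookback_flag) and the threshold alpha are illustrative assumptions, not the paper's implementation; the full Look-back algorithm also measures drift relative to the prefix and restricts the sampling distribution rather than merely flagging a step.

```python
import numpy as np

def kl_divergence(p: np.ndarray, q: np.ndarray, eps: float = 1e-12) -> float:
    """KL(p || q) between two next-token probability distributions."""
    p = np.clip(p, eps, None)
    q = np.clip(q, eps, None)
    return float(np.sum(p * np.log(p / q)))

def lookback_flag(current: np.ndarray,
                  history: list[np.ndarray],
                  alpha: float = 0.1) -> bool:
    """Flag the current step if its distribution is suspiciously close
    (in KL) to any earlier step's distribution -- the repetition signal.
    `alpha` is a hypothetical threshold, not a value from the paper."""
    if not history:
        return False
    min_kl = min(kl_divergence(current, h) for h in history)
    return min_kl < alpha

if __name__ == "__main__":
    # Toy vocabulary of 5 tokens: a near-duplicate distribution is
    # flagged as potential repetition, a distinct one is not.
    step1 = np.array([0.70, 0.10, 0.10, 0.05, 0.05])
    step2 = np.array([0.69, 0.11, 0.10, 0.05, 0.05])  # near-duplicate
    step3 = np.array([0.05, 0.05, 0.10, 0.10, 0.70])  # distinct
    history = [step1]
    print(lookback_flag(step2, history))  # True: minKL is tiny
    print(lookback_flag(step3, history))  # False: minKL is large
```

In this sketch, a flagged step would trigger whatever intervention the decoder uses; per the abstract, Look-back responds by removing offending tokens so the next-token distribution stays within a plausible distance of the history.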
