Generating Human-level Text with Contrastive Search in Transformers 🤗


In this blog, we introduce the current state-of-the-art decoding method for neural text generation: Contrastive Search. Contrastive search was originally proposed in "A Contrastive Framework for Neural Text Generation" [1] ([Paper][Official Implementation]) at NeurIPS 2022. In the follow-up work, "Contrastive Search Is What You Need For Neural Text Generation" [2] ([Paper] [Official Implementation]), the authors further demonstrate that contrastive search can generate human-level text with off-the-shelf language models across 16 languages. Contrastive search is now available in 🤗 transformers, in both PyTorch and TensorFlow. You can interact with the examples shown in this blog post, using your framework of choice, in this colab notebook, which is linked at the top.
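As a quick preview, here is a minimal sketch of how contrastive search can be invoked through the transformers `generate()` API: passing `penalty_alpha` (the degeneration penalty) together with `top_k` (the candidate set size) activates it. The model choice (GPT-2), the prompt, and the hyperparameter values `penalty_alpha=0.6, top_k=4` below are illustrative.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load an off-the-shelf language model; GPT-2 is used here as an example.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "DeepMind Company is"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

# Contrastive search is triggered by setting penalty_alpha together with top_k.
output = model.generate(input_ids, penalty_alpha=0.6, top_k=4, max_new_tokens=64)

text = tokenizer.decode(output[0], skip_special_tokens=True)
print(text)
```

Leaving out `penalty_alpha` (or setting it to 0) falls back to plain greedy search, so the two decoding strategies are easy to compare side by side.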