A New Coca-Cola Flavor at the End of the World
Coca-Cola often experiments with new flavors, and they're usually flavors you can imagine, having tasted them before: vanilla, cherry, lemon. But the latest is called Y3000, a reference to the far-off year 3000, and Coca-Cola says it was concocted, in some way, with the help of artificial intelligence. It smells like circus-peanut candies and tastes mostly like Coke. The company says this soda was made to evoke a "positive future," with a label that has "a futuristic feel," thanks to its color palette of silver, violet, magenta, and cyan. The Coca-Cola logo on the Y3000 bottle is made of "fluid dot clusters that merge to represent the human connections of our future planet."
Twist Decoding: Diverse Generators Guide Each Other
Kasai, Jungo, Sakaguchi, Keisuke, Le Bras, Ronan, Peng, Hao, Lu, Ximing, Radev, Dragomir, Choi, Yejin, Smith, Noah A.
Many language generation models are now available for a wide range of generation tasks, including machine translation and summarization. Combining such diverse models may lead to further progress, but ensembling generation models is challenging during inference: conventional ensembling methods (e.g., shallow fusion) require that the models share vocabulary/tokenization schemes. We introduce Twist decoding, a simple and general text generation algorithm that benefits from diverse models at inference time. Our method does not assume that the vocabulary, tokenization, or even generation order is shared. Our extensive evaluations on machine translation and scientific paper summarization demonstrate that Twist decoding substantially outperforms each model decoded in isolation across various scenarios, including cases where domain-specific and general-purpose models are both available. Twist decoding also consistently outperforms the popular reranking heuristic where output candidates from one model are rescored by another. We hope that our work will encourage researchers and practitioners to examine generation models collectively, not just independently, and to seek out models with strengths complementary to those currently available. Our code is available at https://github.com/jungokasai/twist_decoding.
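To make the setup the abstract describes more concrete, here is a minimal, hypothetical sketch of inference-time collaboration between two generators with different vocabularies: each model decodes the shared input while treating the other's latest hypothesis as a guide. The `GenerateFn` interface, the `collaborative_decode` function, and the toy models are illustrative assumptions for this sketch, not the paper's actual algorithm or the API of its repository.

```python
# Hypothetical sketch: two generation models with different tokenizations
# alternately guide each other at inference time. The guidance mechanism
# here is a stand-in; in practice it would bias the model's beam search
# toward the guide hypothesis.
from typing import Callable, Optional

# A "model" is abstracted as a function from (source text, optional guide
# hypothesis produced by the other model) to an output string.
GenerateFn = Callable[[str, Optional[str]], str]


def collaborative_decode(source: str,
                         model_f: GenerateFn,
                         model_g: GenerateFn,
                         rounds: int = 2) -> str:
    """Alternate between two models, feeding each one the other's latest
    hypothesis as a guide, and return the final hypothesis."""
    hyp_f = model_f(source, None)       # first pass: no guidance
    hyp_g = model_g(source, hyp_f)      # g decodes guided by f's output
    for _ in range(rounds - 1):
        hyp_f = model_f(source, hyp_g)  # f is now guided by g, and so on
        hyp_g = model_g(source, hyp_f)
    return hyp_g


# Toy stand-ins so the sketch runs end to end; real models would be, e.g.,
# a domain-specific and a general-purpose translation system.
def toy_model_a(source: str, guide: Optional[str]) -> str:
    out = source.upper()
    return out if guide is None else out + " | guided by: " + guide[:20]


def toy_model_b(source: str, guide: Optional[str]) -> str:
    out = source[::-1]
    return out if guide is None else out + " | guided by: " + guide[:20]


if __name__ == "__main__":
    print(collaborative_decode("ensemble me", toy_model_a, toy_model_b))
```

Note the contrast with the reranking heuristic mentioned in the abstract: reranking lets one model generate candidates and another only score them afterward, whereas the sketch above lets both models generate, each informed by the other's output.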