Sequence Modeling with Unconstrained Generation Order
Dmitrii Emelianenko, Elena Voita, Pavel Serdyukov
Neural Information Processing Systems
The dominant approach to sequence generation is to produce a sequence in some predefined order, e.g. left-to-right. In contrast, we propose a more general model that can generate the output sequence by inserting tokens in any arbitrary order. Our model learns decoding order as a result of its training procedure. Our experiments show that this model is superior to fixed-order models on a number of sequence generation tasks, such as Machine Translation, Image-to-LaTeX and Image Captioning.
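The core idea is that the output is built by repeatedly inserting a token at any slot of the partial sequence, rather than only appending on the right. A minimal sketch of this insertion view of decoding (the function name and the toy insertion trace are illustrative assumptions, not the paper's actual model):

```python
def apply_insertions(ops):
    """Build a sequence by applying (position, token) insertion operations
    in order. Each step may place a token at any slot of the partial
    output, unlike left-to-right decoding, which always appends at the end.
    (Illustrative sketch only; the paper's model *learns* which insertion
    to make at each step.)"""
    seq = []
    for pos, tok in ops:
        seq.insert(pos, tok)  # insert at an arbitrary position
    return seq

# The phrase "the cat sat" can be produced out of left-to-right order:
# emit "sat" first, then insert "cat" and "the" before it.
print(apply_insertions([(0, "sat"), (0, "cat"), (0, "the")]))
# → ['the', 'cat', 'sat']
```

A fixed left-to-right decoder corresponds to the special case where every operation inserts at the end of the current sequence.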