Reviews: Towards Text Generation with Adversarially Learned Neural Outlines

Neural Information Processing Systems 

Update after author response: It is good to see the effort put into conducting a human evaluation, and I encourage the authors to include more details in the next revision, e.g., how the annotators were selected and what questions were asked.

This manuscript proposes a generative adversarial approach to text generation. Specifically, sentences are encoded into vectors in a latent space by an RNN encoder; a decoder conditions on the vector and generates text auto-regressively. The latent representations are adversarially regularized towards a fixed prior. The method is evaluated in both unconditional and conditional generation settings, and the paper argues for better performance than baselines.
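To make the summarized setup concrete, here is a minimal numpy sketch of the adversarial latent regularization being described. All names, dimensions, and the mean-pooling "encoder" are illustrative assumptions; the paper's actual model uses an RNN encoder, an auto-regressive decoder, and its own training procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (assumptions, not the paper's actual sizes)
vocab, emb_dim, latent_dim = 50, 16, 8

# Toy "encoder": embedding lookup + mean pooling + linear projection,
# standing in for the paper's RNN encoder.
E = rng.normal(0, 0.1, (vocab, emb_dim))
W_enc = rng.normal(0, 0.1, (emb_dim, latent_dim))

def encode(token_ids):
    """Map a sentence (array of token ids) to a latent vector."""
    return E[token_ids].mean(axis=0) @ W_enc

# Toy discriminator: logistic regression over the latent space,
# estimating P(z was drawn from the prior).
w_d = rng.normal(0, 0.1, latent_dim)

def disc(z):
    return 1.0 / (1.0 + np.exp(-z @ w_d))

# Fixed prior over the latent space (standard Gaussian here).
def sample_prior(n):
    return rng.standard_normal((n, latent_dim))

# One batch of toy "sentences" (random token ids).
sentences = [rng.integers(0, vocab, size=10) for _ in range(4)]
z_fake = np.stack([encode(s) for s in sentences])  # encoded sentences
z_real = sample_prior(len(sentences))              # samples from the prior

# Standard GAN-style losses: the discriminator separates prior samples
# from encoded latents; the encoder is trained to fool it, which pushes
# the latent distribution towards the fixed prior.
eps = 1e-9
d_loss = (-np.mean(np.log(disc(z_real) + eps))
          - np.mean(np.log(1.0 - disc(z_fake) + eps)))
enc_loss = -np.mean(np.log(disc(z_fake) + eps))
```

In the full model, gradients of `enc_loss` would flow into the encoder parameters, so that after training, latents decoded from prior samples yield novel text.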