Reviews: Training Language GANs from Scratch

Neural Information Processing Systems 

This paper required quite a bit of discussion among the reviewers. The main concern was that each individual technique proposed in the paper has been tried before. However, their combination enables something that has not been shown previously: training a decent text GAN without MLE pre-training. While the submission does not provide a convincing argument for switching from MLE to GANs in text generation, it is still an important paper. Some may question whether the text GAN direction will ever deliver state-of-the-art language generation models, but it remains an active area of research.