NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019, at the Vancouver Convention Center
Paper ID: 2410
Title: Training Language GANs from Scratch

This paper required quite a bit of discussion among the reviewers. The main concern was that each individual technique proposed in the paper has been tried before. However, their combination enables something that has not been shown previously: training a decent text GAN without MLE pre-training. While the submission does not provide a convincing argument for switching from MLE to GAN training in text generation, it is still an important paper. Some may question whether the text GAN direction will ever deliver state-of-the-art language generation models, but it remains an active area: many researchers over the last couple of years have tried to train text GANs from scratch and failed. I believe this paper should be a must-read for anyone working on this topic. I also find the paper well and honestly written; it does not oversell GANs and carefully compares the model to MLE training.