Review for NeurIPS paper: Sampling-Decomposable Generative Adversarial Recommender


Summary and Contributions: This paper analyzes the well-known GAN-based information retrieval framework IRGAN in the recommendation setting and proposes several interesting modifications that significantly improve its training efficiency and scalability for recommendation tasks. Specifically, the paper first points out two problems with IRGAN: (1) the plain GAN objective can drive the optimal negative sampler toward degenerate cases (delta distributions), and (2) sampling from the optimal negative sampler is computationally expensive. To address (1), the paper adds an entropy regularization term that smooths the optimal negative-sampler distribution. To address (2), the paper uses self-normalized importance sampling to approximate the optimal negative sampler derived in (1), with a proposal distribution from which sampling decomposes into two-step categorical sampling. Further, the paper describes a strategy for learning the proposal distribution by minimizing the estimation variance through a constrained optimization.
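To make the reviewed mechanism concrete, below is a minimal sketch (not the paper's actual algorithm) of self-normalized importance sampling with a decomposable proposal: items are partitioned into groups, sampling is two cheap categorical draws (group, then item within group), and the self-normalized weights let us estimate expectations under an unnormalized target. The scores, group assignment, and uniform within-group proposal are all illustrative placeholders; the paper instead learns the proposal to minimize estimation variance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: N items; the target (optimal negative-sampler)
# distribution p is known only up to normalization via item scores.
N, K = 12, 3
scores = rng.normal(size=N)      # stand-in for discriminator scores
p_unnorm = np.exp(scores)        # unnormalized target distribution

# Decomposable proposal: step 1 draws one of K groups, step 2 draws an
# item within that group -- two cheap categorical draws instead of one
# draw over all N items.
groups = np.arange(N) % K                         # assumed item-to-group map
group_members = [np.where(groups == g)[0] for g in range(K)]
q_group = np.ones(K) / K                          # placeholder group proposal

def q_prob(i):
    """Proposal probability q(i) = q(group) * q(item | group)."""
    g = groups[i]
    return q_group[g] / len(group_members[g])     # uniform within group

def sample_item():
    g = rng.choice(K, p=q_group)                  # step 1: group
    return rng.choice(group_members[g])           # step 2: item within group

# Self-normalized importance sampling: estimate E_p[f] without knowing
# p's normalizing constant.
M = 20000
idx = np.array([sample_item() for _ in range(M)])
w = p_unnorm[idx] / np.array([q_prob(i) for i in idx])
f = scores                                        # example integrand
estimate = np.sum(w * f[idx]) / np.sum(w)
exact = np.sum(p_unnorm * f) / np.sum(p_unnorm)
```

With enough samples `estimate` converges to `exact`; the variance of the weights is exactly what the paper's constrained optimization over the proposal is meant to reduce.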