NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019, at the Vancouver Convention Center
Paper ID: 2260
A Prior of a Googol Gaussians: a Tensor Ring Induced Prior for Generative Models

The paper introduces a novel way of parameterizing a mixture of Gaussians with exponentially many modes, using the Tensor Train decomposition to capture the dependence between the mixing variables of the per-dimension 1D Gaussian mixtures. The resulting distribution, which supports efficient marginalization and conditioning, is then used as a prior in VAEs and GANs. The reviewers agreed that the idea is novel and interesting and that the paper is well written. In the rebuttal, the authors addressed some of the reviewers' concerns about the mismatch in the number of parameters between the proposed prior and the baseline priors. The main remaining weakness of the paper is the lack of baselines with strong priors (e.g. autoregressive or expressive flows), though the flow-based baseline provided in the rebuttal is a reasonable first step in that direction.
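To make the summary above concrete, the construction can be sketched as follows: the joint mixing weight over the per-dimension component indices is given by the trace of a product of nonnegative tensor cores, so marginalizing a mixing variable just collapses one core and the normalizer costs O(d K R^2) instead of enumerating K^d modes. This is only an illustrative sketch under assumed shapes and an assumed (exponential) positivity parameterization of the cores, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

d, K, R = 3, 4, 2  # dims, 1D Gaussian components per dim, tensor rank (illustrative)

# Nonnegative cores A[k][s] of shape (R, R); positivity via exp is an assumption.
cores = np.exp(rng.normal(size=(d, K, R, R)))
mu = rng.normal(size=(d, K))            # per-dimension component means
sigma = np.exp(0.1 * rng.normal(size=(d, K)))  # per-dimension component stds


def normalizer(cores):
    # Summing out each mixing variable s_k collapses core k to B_k = sum_s A_k[s],
    # so Z = Tr(B_1 ... B_d): a chain of d matrix products, not K^d terms.
    M = np.eye(cores.shape[2])
    for k in range(cores.shape[0]):
        M = M @ cores[k].sum(axis=0)
    return np.trace(M)


def gauss_pdf(x, m, s):
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))


def density(x, cores, mu, sigma):
    # p(x) = Tr( prod_k sum_s A_k[s] * N(x_k; mu_ks, sigma_ks) ) / Z
    M = np.eye(cores.shape[2])
    for k in range(cores.shape[0]):
        weights = gauss_pdf(x[k], mu[k], sigma[k])          # shape (K,)
        M = M @ np.einsum("s,sij->ij", weights, cores[k])   # weighted core sum
    return np.trace(M) / normalizer(cores)
```

For the small sizes above, the trace-of-products normalizer can be checked against brute-force enumeration of all K^d component assignments, which is what makes the efficient-marginalization claim easy to verify.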