NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019, at the Vancouver Convention Center
Paper ID: 1610
Title: On the Global Convergence of (Fast) Incremental Expectation Maximization Methods

This paper studies the convergence of incremental and stochastic Expectation-Maximization (EM). The submission is borderline and was carefully discussed. The reviewers' main concern is that Theorem 1 in this submission is a special case of Theorem 1 in submission 1613 (from the same authors). After discussion, the reviewers recognized that Theorem 2 is more important than Theorem 1 and constitutes the main contribution of this work. I therefore recommend rejecting 1613 and accepting this submission. Regarding stochastic variance-reduced EM, the following important reference is missing: [1] Zhu, Rongda, Lingxiao Wang, Chengxiang Zhai, and Quanquan Gu. "High-dimensional variance-reduced stochastic gradient expectation-maximization algorithm." In ICML, 2017.