NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019, Vancouver Convention Center
Paper ID: 7167
Title: Stochastic Continuous Greedy++: When Upper and Lower Bounds Match

This paper considers DR-submodular maximization under convex constraints and achieves optimal results in terms of both approximation ratio and query complexity. The first paper in this line of work showed that Stochastic Gradient Ascent (SGA) achieves a (1/2 - epsilon) approximation using O(1/epsilon^2) stochastic gradient evaluations, and several follow-up papers have studied the problem since then. This paper introduces SCG++, which achieves a (1 - 1/e - epsilon) approximation using O(1/epsilon^2) stochastic oracle calls by means of a novel variance reduction method. The authors also prove a matching lower bound, showing that this query complexity is optimal. In addition, they point out corollaries for maximizing a noisy submodular set function subject to a matroid constraint.
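
To make the variance-reduction idea concrete, here is a minimal Python sketch of one way such a loop can be organized; the oracle interfaces grad_oracle, hvp_oracle, and lmo are hypothetical placeholders introduced for this review, not the authors' implementation:

import numpy as np

def scg_pp_sketch(grad_oracle, hvp_oracle, lmo, d, T, b0, b):
    # Hypothetical interfaces (placeholders, not the paper's code):
    #   grad_oracle(x, n)   -> average of n stochastic gradients at x
    #   hvp_oracle(x, u, n) -> average of n stochastic Hessian-vector
    #                          products at x applied to direction u
    #   lmo(g)              -> argmax over v in the constraint set C of <v, g>
    x = np.zeros(d)             # continuous greedy starts at the origin
    g = grad_oracle(x, b0)      # large initial batch anchors the estimate
    for _ in range(T):
        v = lmo(g)              # best ascent direction under the constraint
        x_next = x + v / T      # step size 1/T keeps the final iterate in C
        # Variance reduction: estimate grad F(x_next) - grad F(x) with a
        # stochastic Hessian-vector product at a random point of the segment,
        # so the noise in the running estimate g does not accumulate.
        a = np.random.rand()
        g = g + hvp_oracle(x + a * (x_next - x), x_next - x, b)
        x = x_next
    return x

The key point, reflected in the update of g, is that estimating gradient differences rather than fresh gradients is what allows the oracle complexity to come down to O(1/epsilon^2).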