NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019 at Vancouver Convention Center
Paper ID: 323
Title: RSN: Randomized Subspace Newton


This paper considers a class of randomized subspace Newton methods, which sketch the Hessian onto arbitrary subspaces, and gives a systematic convergence analysis. The reviewers agree on the technical contributions but are relatively conservative on originality. In addition, the numerical experiments against batch gradient and accelerated gradient methods are not very convincing, since the state of the art for the problems considered consists of randomized incremental or coordinate descent methods and their accelerated variants.
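For context, the method class under discussion takes a Newton step restricted to a random subspace: sample a sketch matrix S, form the sketched Hessian S^T H S and sketched gradient S^T g, and update within the span of S. The sketch below is a minimal illustration on a convex quadratic, not the authors' implementation; the function name and problem setup are illustrative.

```python
import numpy as np

def rsn_quadratic(A, b, x0, sketch_dim, iters, seed=0):
    """Randomized subspace Newton sketch for f(x) = 0.5 x^T A x - b^T x, A SPD.

    Each iteration draws a random subspace basis S, sketches the Hessian A
    onto it, and takes the exact Newton step within that subspace.
    (Illustrative sketch, not the paper's code.)
    """
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    n = len(x)
    for _ in range(iters):
        S = rng.standard_normal((n, sketch_dim))  # random subspace basis
        g = A @ x - b                             # full gradient
        Hs = S.T @ A @ S                          # sketched Hessian (s x s)
        lam = np.linalg.solve(Hs, S.T @ g)        # subspace Newton direction
        x -= S @ lam                              # update stays in span(S)
    return x

# Small SPD test problem.
rng = np.random.default_rng(1)
M = rng.standard_normal((20, 20))
A = M @ M.T + 20 * np.eye(20)
b = rng.standard_normal(20)
x = rsn_quadratic(A, b, np.zeros(20), sketch_dim=5, iters=2000)
print(np.linalg.norm(A @ x - b))  # residual gradient norm, should be small
```

Because each subspace step exactly minimizes the quadratic over span(S), the objective decreases monotonically, and with Gaussian sketches the iterates converge linearly in expectation at a rate governed by the sketch dimension and the conditioning of A.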