NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019, at the Vancouver Convention Center
Paper ID: 4163
Title: Globally Convergent Newton Methods for Ill-conditioned Generalized Self-concordant Losses

The paper studies large-scale convex optimization algorithms based on Newton's method applied to regularized generalized self-concordant losses, with particular attention to ill-conditioned settings, and provides new optimal generalization bounds together with proofs of global convergence. The reviewers found the contributions to be of high quality and were satisfied with the clarifications provided in the author response.
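
For readers unfamiliar with the setting, the sketch below shows a generic damped Newton step on an L2-regularized logistic loss, a textbook example of a generalized self-concordant loss. It is illustrative only and is not the algorithm proposed in the paper; all names and constants are assumptions chosen for the example.

```python
# Minimal sketch (not the paper's method): one damped, L2-regularized Newton
# step for logistic regression, a standard generalized self-concordant loss.
import numpy as np

def regularized_newton_step(w, X, y, lam):
    """One damped Newton step for
    f(w) = (1/n) * sum_i log(1 + exp(-y_i x_i^T w)) + (lam/2) * ||w||^2."""
    n = X.shape[0]
    margins = y * (X @ w)
    sigma = 1.0 / (1.0 + np.exp(margins))              # sigmoid(-margins)
    grad = -(X.T @ (y * sigma)) / n + lam * w           # gradient of f at w
    D = sigma * (1.0 - sigma)                           # per-sample curvature
    H = (X.T * D) @ X / n + lam * np.eye(X.shape[1])    # regularized Hessian
    direction = np.linalg.solve(H, grad)                # Newton direction
    decrement = np.sqrt(grad @ direction)               # Newton decrement
    step = 1.0 / (1.0 + decrement)                      # damped step size
    return w - step * direction

# Example usage on synthetic data (labels in {-1, +1})
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
y = np.sign(X @ rng.standard_normal(10) + 0.1 * rng.standard_normal(200))
w = np.zeros(10)
for _ in range(10):
    w = regularized_newton_step(w, X, y, lam=1e-3)
```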