NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019, at the Vancouver Convention Center
Paper ID: 5181
Title: Convergence-Rate-Matching Discretization of Accelerated Optimization Flows Through Opportunistic State-Triggered Control


This paper proposes to use ideas from opportunistic state-triggered control to discretize ODEs for accelerated optimization algorithms.

Several reviewers praised the originality of the work, stating that "the originality of the paper is basically beyond doubt" and that "the techniques used are new and relevant for optimization". Some reviewers also raised concerns about how the results compare to those of previous works, for instance asking for a comparison of "the bound obtained in Theorem 3.3 with the existing ones, e.g., those obtained in [23]". Further concerns involved how stringent the assumptions were, in particular for the results in the Nesterov case, as well as the need for additional experimental results.

The authors submitted a detailed response to the reviewers' comments. After reading the response and updating their reviews, the reviewers feel that the paper would greatly benefit from a detailed quantitative discussion and comparison of the obtained bounds with those from [23]. The reviewers also feel that the twice-differentiability assumption should not be needed for the Nesterov case; this deserves further inspection, with the goal of assuming only continuous differentiability instead. An additional expert opinion was sought, which leaned towards "accepting the paper for the novelty of the approach and the results".

All in all, the originality of this work, within this active area of research, is clear. Furthermore, while the current results may seem improvable, the proposed viewpoint and techniques open new avenues for research in the area of high-resolution ODEs for first-order optimization. We strongly recommend that the authors take the reviewers' comments and suggestions into account while preparing the camera-ready version of the paper. Accept.