The convergence analysis of accelerated first-order methods for convex optimization problems is presented from the point of view of ordinary differential equation (ODE) solvers. We first take another look at the acceleration phenomenon via A-stability theory for ODE solvers and explain it by transforming the spectrum to the complex plane. After that, we present the Lyapunov framework for dynamical systems and introduce the strong Lyapunov condition. This framework provides a unified analysis of many existing continuous models for convex optimization, such as the gradient flow, the heavy ball system, the Nesterov accelerated gradient flow, and the dynamical inertial Newton system, among others.
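For orientation, and up to the specific scalings chosen in the talk, these continuous models have the standard forms (with f a smooth convex objective)

\[
\begin{aligned}
&\text{gradient flow:} && x'(t) = -\nabla f(x(t)),\\
&\text{heavy ball:} && x''(t) + \gamma\, x'(t) + \nabla f(x(t)) = 0, \quad \gamma > 0,\\
&\text{Nesterov flow:} && x''(t) + \tfrac{3}{t}\, x'(t) + \nabla f(x(t)) = 0,\\
&\text{inertial Newton:} && x''(t) + \alpha\, x'(t) + \beta\, \nabla^2 f(x(t))\, x'(t) + \nabla f(x(t)) = 0.
\end{aligned}
\]

In its simplest form, a strong Lyapunov condition for a flow x'(t) = \mathcal{G}(x(t)) with Lyapunov function \mathcal{V} requires -\langle \nabla\mathcal{V}(x), \mathcal{G}(x)\rangle \ge c\,\mathcal{V}(x) for some c > 0, which by Gronwall's inequality yields the exponential decay \mathcal{V}(x(t)) \le e^{-ct}\,\mathcal{V}(x(0)); the condition used in the talk may include additional correction terms.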
In the second half of this talk, we extend the acceleration to a class of non-linear saddle point problems. We first develop a transformed primal-dual (TPD) flow in which the flow for the dual variable contains a Schur complement that is strongly convex. We obtain exponential stability of the TPD flow by demonstrating the strong Lyapunov property. Then, we derive TPD iterations using ODE solvers. We propose an accelerated transformed
primal-dual (ATPD) method and prove accelerated linear rates that match the optimal lower bound on iteration complexity. Furthermore, we extend the accelerated gradient flow and develop accelerated gradient and skew-symmetric splitting (AGSS) methods for a more general class of monotone operator equations. This is joint work with Hao Luo (Chongqing Normal University) and Jingrong Wei (UCI).
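As one concrete (though simplified) instance of the saddle point setting above, take

\[
\min_{u}\max_{p}\ \mathcal{L}(u,p) = f(u) + \langle Bu, p\rangle - g(p),
\]

with f smooth and strongly convex. The classical (untransformed) primal-dual gradient flow reads

\[
u'(t) = -\nabla_u \mathcal{L}(u,p) = -\bigl(\nabla f(u) + B^{\top}p\bigr),
\qquad
p'(t) = \nabla_p \mathcal{L}(u,p) = Bu - \nabla g(p),
\]

and the TPD flow transforms this system so that the dual equation contains the strongly convex Schur complement (for a quadratic f(u) = \tfrac12\langle Au, u\rangle this complement is of the form B A^{-1} B^{\top}, possibly plus a contribution from g); the precise transformed equations are those given in the talk and the accompanying paper.

To make the "continuous flow -> ODE solver -> iteration" pipeline concrete, the following minimal Python sketch (not the ATPD or AGSS method of the talk; all parameter choices are standard textbook values assumed for illustration) shows that an explicit Euler step on the gradient flow recovers gradient descent, while a momentum-type discretization of the heavy ball flow with Polyak's parameters attains an accelerated linear rate on an ill-conditioned quadratic.

import numpy as np

# Toy strongly convex quadratic f(x) = 0.5 * x^T Q x with an ill-conditioned
# spectrum; illustrative only, not the methods of the talk.
rng = np.random.default_rng(0)
n = 50
mu, L = 1e-3, 1.0                      # smallest / largest eigenvalue of Q
Q = np.diag(np.linspace(mu, L, n))

def grad(x):
    return Q @ x                       # gradient of f

x0 = rng.standard_normal(n)
num_iter = 2000

# Gradient descent = explicit Euler on the gradient flow x' = -grad f(x),
# with step size h = 1/L (explicit Euler is only stable for h < 2/L).
x = x0.copy()
for _ in range(num_iter):
    x = x - (1.0 / L) * grad(x)
print(f"gradient descent error: {np.linalg.norm(x):.2e}")

# Heavy ball = a two-step discretization of x'' + gamma x' + grad f(x) = 0,
# with Polyak's classical parameters for quadratics.
kappa = L / mu
alpha = 4.0 / (np.sqrt(L) + np.sqrt(mu)) ** 2
beta = ((np.sqrt(kappa) - 1.0) / (np.sqrt(kappa) + 1.0)) ** 2
x, x_prev = x0.copy(), x0.copy()
for _ in range(num_iter):
    x, x_prev = x - alpha * grad(x) + beta * (x - x_prev), x
print(f"heavy ball error:       {np.linalg.norm(x):.2e}")

On this example the momentum iteration reaches roughly machine precision while plain gradient descent is still far from the minimizer, mirroring the acceleration phenomenon discussed in the first part of the talk.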