Presentation Name: Stochastic Gradient Hamiltonian Monte Carlo for Non-Convex Optimization
Presenter: Dr. Lingjiong Zhu
Date: 2019-05-25
Location: Room 1403, Guanghua East Main Building (光华东主楼1403)
Abstract: Stochastic non-convex optimization problems arise in many applications, including data science. Stochastic gradient Hamiltonian Monte Carlo (SGHMC) is a variant of stochastic gradient descent with momentum in which controlled, properly scaled Gaussian noise is added to the stochastic gradients to steer the iterates towards a global minimum. Many works have reported its empirical success in practice, where it can outperform methods based on overdamped Langevin Monte Carlo, such as stochastic gradient Langevin dynamics (SGLD), in many data science applications. In this talk, we provide finite-time performance bounds, with explicit constants, for the global convergence of SGHMC on stochastic non-convex optimization problems. Our results lead to non-asymptotic guarantees for both population and empirical risk minimization. For a fixed target accuracy level, on a class of non-convex problems, we obtain iteration complexity bounds for SGHMC that can be tighter than those for SGLD by up to a square-root factor. These results show that acceleration with momentum is possible in the context of non-convex optimization algorithms. This is based on joint work with Xuefeng Gao and Mert Gurbuzbalaban.
Annual Speech Directory: No. 101
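To illustrate the dynamics the abstract describes, here is a minimal sketch of an SGHMC-style update: momentum with friction, a (stochastic) gradient step, and Gaussian noise whose scale is tied to the friction coefficient, step size, and an inverse temperature. This is a generic discretization of underdamped Langevin dynamics, not the specific scheme analyzed in the talk; the objective, step size `eta`, friction `gamma`, and inverse temperature `beta` below are all illustrative choices.

```python
import numpy as np

def sghmc(grad, x0, eta=1e-3, gamma=1.0, beta=1e6, n_iters=20000, seed=0):
    """Sketch of an SGHMC-style iteration (one common discretization of
    underdamped Langevin dynamics):
      v_{k+1} = v_k - eta*gamma*v_k - eta*grad(x_k) + sqrt(2*gamma*eta/beta)*xi_k
      x_{k+1} = x_k + eta*v_{k+1}
    where xi_k is standard Gaussian noise. Large beta (low temperature)
    makes the noise small, so iterates settle near a minimum."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    noise_scale = np.sqrt(2.0 * gamma * eta / beta)
    for _ in range(n_iters):
        # Friction damps the momentum; the gradient drives it; noise perturbs it.
        v = v - eta * gamma * v - eta * grad(x) \
            + noise_scale * rng.standard_normal(x.shape)
        x = x + eta * v
    return x

# Illustrative non-convex objective: double well f(x) = (x^2 - 1)^2,
# with minima at x = -1 and x = +1.
grad_f = lambda x: 4.0 * x * (x**2 - 1.0)
x_final = sghmc(grad_f, x0=[2.0])
```

In practice `grad` would be a stochastic gradient computed on a mini-batch; here a full gradient of a toy function stands in for it.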