Presentation Name: Joint Statistics Seminar of SCMS and SDS: Geometry of non-convex landscapes: Deep learning, matrix completion, and saddle points
Presenter: Dr. Jason Lee
Date: 2017-12-21
Location: Room 2201, Guanghua East Main Building (光华东主楼)
Abstract:

We show that saddle points are easy to avoid even for Gradient Descent -- arguably the simplest optimization procedure. We prove that, with probability 1, randomly initialized Gradient Descent converges to a local minimizer. The same result holds for a large class of optimization algorithms, including the proximal point method, mirror descent, and coordinate descent.
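As a toy illustration of this phenomenon (a minimal sketch, not part of the talk), randomly initialized gradient descent on f(x, y) = x^4/4 - x^2/2 + y^2/2 almost surely escapes the strict saddle at the origin and converges to one of the minimizers at (+1, 0) or (-1, 0); only the measure-zero set of initializations with x = 0 converges to the saddle. The test function, step size, and iteration count below are illustrative assumptions, not taken from the talk.

import numpy as np

def grad_f(p):
    x, y = p
    return np.array([x**3 - x, y])   # gradient of x^4/4 - x^2/2 + y^2/2

rng = np.random.default_rng(0)
p = rng.normal(size=2)               # random initialization
step = 0.1
for _ in range(500):
    p = p - step * grad_f(p)

print(p)  # ends near (+1, 0) or (-1, 0); the strict saddle at (0, 0) is avoided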

Next, we study the problem of learning a two-layer ReLU network and the matrix completion problem. Despite the non-convexity of both problems, we prove that every local minimizer is a global minimizer. Combined with the preceding algorithmic result on gradient descent, this shows that simple gradient-based methods can find the global optimum of these non-convex problems.
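As a rough numerical illustration of the matrix completion claim (a sketch under assumed problem size, rank, sampling rate, and step size, none of which come from the talk), the snippet below runs gradient descent on the non-convex factorized objective, the sum over observed entries (i, j) of ((U U^T)_ij - M_ij)^2, in a small symmetric rank-2 instance; from a random start the residual on observed entries is driven toward zero.

import numpy as np

rng = np.random.default_rng(1)
n, r = 30, 2
U_true = rng.normal(size=(n, r)) / np.sqrt(n)
M = U_true @ U_true.T                        # rank-r ground-truth matrix
mask = rng.random((n, n)) < 0.5
mask = (mask | mask.T).astype(float)         # symmetric set of observed entries

U = 0.1 * rng.normal(size=(n, r))            # random initialization
step = 0.05
for _ in range(2000):
    R = mask * (U @ U.T - M)                 # residual on observed entries only
    U = U - step * 4.0 * R @ U               # gradient of the sampled squared loss

print(np.linalg.norm(mask * (U @ U.T - M)))  # observed-entry residual, close to 0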


Annual Speech Directory: No.298
