Title: Learning Theory of Stochastic Gradient Descent
Speaker: Yunwen Lei
Affiliation: University of Birmingham
Date: 2021-10-13
Time: 16:50-17:50
Venue: Tencent Meeting ID: 669602531, Password: 200433
Abstract:
Stochastic gradient descent (SGD) has become the workhorse behind many machine learning problems. Optimization and sampling errors are two competing factors that govern the statistical behavior of SGD. In this talk, we report our generalization analysis of SGD, considering the optimization and sampling errors simultaneously. We remove some restrictive assumptions in the literature and significantly improve the existing generalization bounds. Our results help to understand how to stop SGD early to achieve the best statistical performance.
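The early-stopping idea in the abstract can be illustrated with a minimal sketch: run SGD on a least-squares problem and stop when a held-out validation error stops improving, so that optimization error keeps shrinking while sampling (overfitting) error is kept in check. All data, step sizes, and the stopping rule below are illustrative assumptions, not the analysis from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 5
w_true = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = X @ w_true + 0.5 * rng.normal(size=n)

# Hold out a validation split to monitor generalization.
X_tr, y_tr, X_val, y_val = X[:150], y[:150], X[150:], y[150:]

def sgd_early_stop(X, y, X_val, y_val, passes=50, eta=0.01, patience=5):
    """SGD on the squared loss 0.5*(x.w - y)^2; stop when the
    validation error fails to improve for `patience` passes."""
    w = np.zeros(X.shape[1])
    best_err, best_w, bad = np.inf, w.copy(), 0
    for _ in range(passes):
        for i in rng.permutation(len(y)):        # one pass over the data
            grad = (X[i] @ w - y[i]) * X[i]      # stochastic gradient
            w -= eta * grad                       # optimization step
        err = np.mean((X_val @ w - y_val) ** 2)  # proxy for sampling error
        if err < best_err:
            best_err, best_w, bad = err, w.copy(), 0
        else:
            bad += 1
            if bad >= patience:                  # early stopping
                break
    return best_w, best_err

w_hat, val_err = sgd_early_stop(X_tr, y_tr, X_val, y_val)
```

Here the number of passes plays the role of the stopping time studied in the talk: too few passes leave a large optimization error, while too many let the sampling error dominate.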
Poster: 10-13海报.pdf


School seminar number for this year: 243

Copyright © 2012 School of Mathematical Sciences, Fudan University. 沪ICP备042465