Presentation Name: Training Neural Networks and Mean-field Langevin dynamics
Presenter: 任振杰
Date: 2020-12-15
Location: Zoom Meeting ID: 610 473 98860, Password: 123456
Abstract:
Neural networks have become an extremely useful tool in various applications such as statistical learning and sampling. Their empirical success calls for a theoretical investigation based on mathematical models. Recently it has become popular to treat the training of neural networks as an optimization problem on the space of probability measures. In this talk we show that the optimizer of such a problem can be approximated using the so-called mean-field Langevin dynamics. This theory sheds light on the efficiency of the (stochastic) gradient descent algorithm for training neural networks. Based on the theory, we also propose a new algorithm for training generative adversarial networks (GANs), and test it by producing samples from simple probability distributions.
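
As background to the abstract, a standard way to write the entropy-regularized problem and the associated dynamics is the following display (the notation F, δF/δm, σ is our assumption, not taken from the talk):

\[
\min_{m \in \mathcal{P}(\mathbb{R}^d)} \Big( F(m) + \frac{\sigma^2}{2} \int_{\mathbb{R}^d} m(x)\,\log m(x)\,\mathrm{d}x \Big),
\qquad
\mathrm{d}X_t = -\nabla \frac{\delta F}{\delta m}(m_t)(X_t)\,\mathrm{d}t + \sigma\,\mathrm{d}W_t,
\quad m_t = \mathrm{Law}(X_t).
\]

Under convexity and regularity assumptions on F, the marginal law m_t converges to the minimizer of the regularized objective; for a two-layer network, F(m) is the expected loss of the network whose neurons are distributed according to m.

The connection to (stochastic) gradient descent mentioned in the abstract can be illustrated by a particle discretisation: N neurons take noisy gradient steps, and their empirical measure approximates m_t. The sketch below is a hypothetical minimal illustration on toy regression data; the network, data, step size and noise level are assumptions made for the example, not the speaker's algorithm, and the confining regularization term usually required by the theory is omitted for brevity.

import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = sin(3x) plus a little observation noise (illustrative only).
X = rng.uniform(-1.0, 1.0, size=200)
y = np.sin(3.0 * X) + 0.05 * rng.standard_normal(200)

N = 500                               # number of particles (hidden neurons)
theta = rng.standard_normal((N, 3))   # particle i carries (a_i, w_i, b_i)
sigma = 0.1                           # Langevin noise intensity
dt = 0.05                             # Euler-Maruyama time step

def forward(theta, X):
    a, w, b = theta[:, 0], theta[:, 1], theta[:, 2]
    h = np.tanh(np.outer(X, w) + b)   # hidden activations, shape (n_data, N)
    return h, h @ a / N               # mean-field output f(x) = (1/N) * sum_i a_i * tanh(w_i x + b_i)

for step in range(2001):
    h, f = forward(theta, X)
    residual = f - y
    a = theta[:, 0]
    dtanh = 1.0 - h ** 2              # derivative of tanh at the pre-activations
    n = len(X)
    # Drift: gradient in theta of the linear functional derivative of the
    # squared loss F(m) = E[(f_m(x) - y)^2], averaged over the data set.
    grad_a = 2.0 * (residual @ h) / n
    grad_w = 2.0 * a * ((residual * X) @ dtanh) / n
    grad_b = 2.0 * a * (residual @ dtanh) / n
    grad = np.stack([grad_a, grad_w, grad_b], axis=1)
    # One Euler-Maruyama step of the mean-field Langevin particle system:
    # noisy gradient descent on the neurons.
    theta += -dt * grad + sigma * np.sqrt(dt) * rng.standard_normal((N, 3))
    if step % 500 == 0:
        print(f"step {step:5d}   mse {np.mean(residual ** 2):.4f}")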

Poster

Annual Speech Directory: No.352
