Talks

David Haink
The inverse variance-flatness relation in stochastic gradient descent
Wednesday 01.06.2022
Slides | Code (MNIST) | Code (CIFAR)
t.b.a.
The inverse variance-flatness relation in stochastic gradient descent: Part II
Wednesday 08.06.2022
Slides | Code
t.b.a.
Geometry of Neural Network Loss Surfaces via Random Matrix Theory
Wednesday 25.05.2022
Slides | Code
t.b.a.
High-dimensional dynamics of generalization error in neural networks
Wednesday 01.06.2022
Slides | Code
t.b.a.
Statistical Mechanics of Deep Linear Neural Networks: The Backpropagating Kernel Renormalization
Wednesday 08.06.2022
Slides | Code
t.b.a.
Modeling the Influence of Data Structure on Learning in Neural Networks: the Hidden Manifold Model
Wednesday 15.06.2022
Slides | Code

Written Assignments

Course Information

Talks take place on Wednesdays, 15:15-16:45, in SR 114 (ITP, Brüderstraße 16).

The first meeting will take place on April 6th.

Topics and References

1. The inverse variance-flatness relation in stochastic gradient descent
   Reference: Feng et al., PNAS 118, 9 (2021) and Supplement
2. Geometry of Neural Network Loss Surfaces via Random Matrix Theory
   Reference: Pennington et al., PMLR 70, 2798 (2017) and Supplement
3. High-dimensional dynamics of generalization error in neural networks
   Reference: Advani et al., Neural Networks 132, 428 (2020)
4. Statistical Mechanics of Deep Linear Neural Networks: The Backpropagating Kernel Renormalization
   Reference: Li et al., PRX 11, 031059 (2021)
5. Modeling the Influence of Data Structure on Learning in Neural Networks: the Hidden Manifold Model
   Reference: Goldt et al., PRX 10, 041044 (2020)