
Talks

1. First talk: Deep learning basics (April 24)
2. Lecture by Prof. Rosenow: Convolutional neural networks (May 08)
3. Felix Stündel: Recurrent neural networks (May 15)
4. Johannes Ewald: Graph neural networks (May 22)
5. Yoon Thelge: Autoencoders (May 29)
6. Hamid Al Zubaidi: Generative models (June 05)
7. Stephan Meyer: Transformers (June 12)

Written Assignments

Course Information

Time of talks: Wednesdays, 17:15
R114, ITP, Brüderstraße 16 (Talks)
First meeting: April 10, 2024, 17:15,
R114, ITP, Brüderstraße 16 (General information)


Topic References
1. Deep learning basics (building blocks of NNs, training, fully connected networks): Ref. [1] (Sec. 3, 4, 7); Ref. [2] (Sec. 9)
2. Convolutional neural networks: Ref. [1] (Sec. 8); Ref. [2] (Sec. 10)
3. Recurrent neural networks: Ref. [1] (Sec. 9)
4. Graph networks: Ref. [1] (Sec. 10)
5. Interpretability of networks: Ref. [1] (Sec. 12)
6. Objective functions: Ref. [1] (Sec. 14)
7. Unsupervised learning: Ref. [2] (Sec. 12, 13)
8. Autoencoders: Ref. [1] (Sec. 17); Ref. [2] (Sec. 17)
9. Generative models: Ref. [1] (Sec. 18); Ref. [2] (Sec. 15, 16)
10. Transformers, "Attention is all you need": Ref. [3]
References
[1]    M. Erdmann, J. Glombitza, G. Kasieczka, and U. Klemradt, Deep Learning for Physics Research, DOI 10.1142/12294
[2]    P. Mehta et al., A high-bias, low-variance introduction to Machine Learning for physicists, Physics Reports 810, 1–124 (2019)
[3]    A. Vaswani et al., Attention is all you need, Advances in Neural Information Processing Systems 30 (2017)