Talks

Topic | Date | Materials
Deep learning basics | April 24 | Slides
Convolutional neural networks | May 08 |
Recurrent neural networks | May 15 |
Graph neural networks | May 22 |
Autoencoders | May 29 | Slides, Code
Generative models | June 05 |
Transformers | June 12 |
Written Assignments
Course Information
Talks: Wednesdays, 17:15, R114, ITP, Brüderstraße 16
General information: April 10, 2024, 17:15, R114, ITP, Brüderstraße 16
1. Deep learning basics (building blocks of NNs, training, fully connected networks) | Ref. [1] (Sec. 3, 4, 7); Ref. [2] (Sec. 9)
2. Convolutional neural networks | Ref. [1] (Sec. 8); Ref. [2] (Sec. 10)
3. Recurrent neural networks | Ref. [1] (Sec. 9)
4. Graph neural networks | Ref. [1] (Sec. 10)
5. Interpretability of networks | Ref. [1] (Sec. 12)
6. Objective functions | Ref. [1] (Sec. 14)
7. Unsupervised learning | Ref. [2] (Sec. 12, 13)
8. Autoencoders | Ref. [1] (Sec. 17); Ref. [2] (Sec. 17)
9. Generative models | Ref. [1] (Sec. 18); Ref. [2] (Sec. 15, 16)
10. Transformers, "Attention is all you need" | Ref. [3]
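As an illustration of topic 1 (building blocks of NNs and training), the core idea of gradient-based training can be sketched with a single fully connected unit fitted by hand-coded gradient descent. This is a minimal, framework-free sketch for orientation only; the talks themselves would typically use a library such as PyTorch or TensorFlow.

```python
# Minimal sketch: one fully connected unit y = w*x + b trained by
# gradient descent on the mean squared error. The toy data set
# (y = 2x + 1) and all hyperparameters are illustrative assumptions.
import random

random.seed(0)
data = [(x, 2.0 * x + 1.0) for x in [-2.0, -1.0, 0.0, 1.0, 2.0]]

w, b = random.uniform(-1, 1), 0.0
lr = 0.05  # learning rate

for _ in range(200):
    # analytic gradients of the mean squared error w.r.t. w and b
    dw = sum(2 * (w * x + b - y) * x for x, y in data) / len(data)
    db = sum(2 * (w * x + b - y) for x, y in data) / len(data)
    w -= lr * dw
    b -= lr * db

print(round(w, 2), round(b, 2))  # → 2.0 1.0 (recovers y = 2x + 1)
```

Stacking many such units with nonlinear activations between them, and computing the gradients via backpropagation instead of by hand, gives the fully connected networks covered in the first talk.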
References

[1] Martin Erdmann, Jonas Glombitza, Gregor Kasieczka, and Uwe Klemradt, Deep Learning for Physics Research, DOI 10.1142/12294
[2] Pankaj Mehta et al., A high-bias, low-variance introduction to Machine Learning for physicists, Physics Reports 810, 1–124 (2019)
[3] Ashish Vaswani et al., Attention is all you need, Advances in Neural Information Processing Systems 30 (2017)