Deep Learning (lecture course)/2019
This is an introductory course on deep learning models and their application to various problems in image and text analysis.
Instructors: Dmitry Kropotov, Victor Kitov, Nadezhda Chirkova, Oleg Ivanov and Evgeny Nizhibitsky.
Schedule for Autumn 2019: Mondays, lectures start at 10:30, seminars at 12:15, room 526b.
For questions: course chat in Telegram
News
01 Oct: The second practical assignment has been uploaded to anytask. Deadline: 15 Oct.
09 Sep: Today's lecture is cancelled. The seminar will start as usual at 12:15.
06 Sep: The first theoretical assignment has been uploaded to anytask. Deadline: 15 Sep. Please note that this is a strict deadline; late submissions are not accepted.
Rules and grades
We have 7 home assignments during the course. For each assignment, a student may get up to 10 points plus possible bonus points. For some assignments, a student is allowed to submit the completed work up to one week after the deadline with a penalty of 0.5 points per day. All assignments are prepared in English.
In addition, each student may give a short 10-minute talk in English on a recent DL paper. This talk is worth up to 5 points.
The total grade for the course is calculated as Round-up(0.3*<Exam_grade> + 0.7*<Semester_grade>), where <Semester_grade> = min(10, (<Assignments_total_grade> + <Talk_grade>) / 7) and <Exam_grade> is the grade for the final exam (up to 10 points).
Final grade | Total grade | Necessary conditions |
---|---|---|
5 | >=8 | all practical assignments are done, exam grade >= 6 and oral talk is given |
4 | >=6 | 6 practical assignments are done, exam grade >= 4 |
3 | >=4 | 3 practical assignments are done, exam grade >= 4 |
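For concreteness, here is a minimal Python sketch of the grading scheme above. The function names are illustrative; interpreting Round-up as rounding up to the nearest integer and the exact form of the late-submission penalty are assumptions based on the description above.

```python
import math

def semester_grade(assignment_grades, talk_grade):
    """Semester grade: sum over the 7 assignments plus the talk, divided by 7, capped at 10."""
    return min(10, (sum(assignment_grades) + talk_grade) / 7)

def total_grade(exam_grade, assignment_grades, talk_grade):
    """Total grade: Round-up(0.3 * exam + 0.7 * semester); ceil is an assumed reading of Round-up."""
    return math.ceil(0.3 * exam_grade + 0.7 * semester_grade(assignment_grades, talk_grade))

def late_grade(grade, days_late):
    """Assumed late-submission rule: 0.5 points off per day, accepted up to one week after the deadline."""
    assert 0 <= days_late <= 7
    return max(0.0, grade - 0.5 * days_late)

# Example: 8 points on each of the 7 assignments, a 5-point talk, exam grade 7.
# Semester grade = min(10, (56 + 5) / 7) ~= 8.71; total = Round-up(0.3*7 + 0.7*8.71) = Round-up(8.2) = 9.
print(total_grade(7, [8] * 7, 5))  # -> 9
```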
Practical assignments
Practical assignments are published on the course page at anytask.org. Invite code: IXLOwZU
Lectures
Date | No. | Topic | Materials |
---|---|---|---|
02 Sep. 2019 | 1 | Introduction. Fully-connected networks. | |
16 Sep. 2019 | 2 | Optimization and regularization for neural networks. Convolutional neural networks. | Presentation (pptx), Presentation (pdf), A paper about ADAM, A paper about DropOut, A paper about BatchNorm |
23 Sep. 2019 | 3 | Semantic image segmentation | Presentation (pdf), Portrait Demo (source) |
30 Sep. 2019 | 4 | Object detection on images | Presentation (pdf), DS Bowl 2018 report (pdf) |
07 Oct. 2019 | 5 | Image style transfer | Presentations 1, 2, 3, 4, 5, 6. |
14 Oct. 2019 | 6 | Recurrent neural networks | Presentation (pdf) |
21 Oct. 2019 | 7 | Attention in recurrent neural networks | A paper about attention model for image captioning, A paper about Transformer, A paper about BERT |
28 Oct. 2019 | 8 | Variational autoencoder | A paper about VAE |
11 Nov. 2019 | 9 | Generative adversarial networks | Presentation |
18 Nov. 2019 | 10 | Reinforcement learning. Q-learning, DQN. | RL book, chapter 6; A paper about DQN |
25 Nov. 2019 | 11 | Policy gradient in reinforcement learning | RL book, chapter 13; A paper about A3C |
02 Dec. 2019 | 12 | Implicit reparameterization trick. Gumbel-Softmax approach for discrete reparameterization. | A paper about IRT; A paper about Gumbel-Softmax |
09 Dec. 2019 | 13 | Students' talks | |
Seminars
Date | No. | Topic | Need laptops | Materials |
---|---|---|---|---|
2 Sep. 2019 | 1 | Matrix calculus, automatic differentiation. | No | Synopsis |
9 Sep. 2019 | 2 | Introduction to PyTorch | Yes | ipynb
16 Sep. 2019 | 3 | Convolutional neural networks in PyTorch | Yes | ipynb
23 Sep. 2019 | 4 | Semantic image segmentation | No | |
30 Sep. 2019 | 5 | Face recognition | No | |
07 Oct. 2019 | 6 | Image style transfer | No | |
14 Oct. 2019 | 7 | DropOut for recurrent neural networks | Yes | ipynb |
21 Oct. 2019 | 8 | Models with attention | Yes | |
28 Oct. 2019 | 9 | Variational autoencoders, adversarial attacks | Yes | |
11 Nov. 2019 | 10 | Generative adversarial networks | Yes | |
18 Nov. 2019 | 11 | Bandits | No | |
25 Nov. 2019 | 12 | Actor-critic approach in RL | No | |
02 Dec. 2019 | 13 | Semi-supervised discrete VAE | Yes | |
09 Dec. 2019 | 14 | Students' talks | No | |