Deep Learning (lecture course) / 2020
This is an introductory course on deep learning models and their application to various problems of image and text analysis.
Instructors: Dmitry Kropotov, Victor Kitov, Nadezhda Chirkova, Oleg Ivanov and Evgeny Nizhibitsky.
Timetable in Autumn 2020: Fridays, lectures begin at 10:30, seminars begin at 12:15, zoom-link
Lectures and seminars video recordings: link
Anytask invite code: ldQ0L2R
Course chat in Telegram: link
Rules and grades
TBA
Lectures and seminars
Date | No. | Topic | Materials |
---|---|---|---|
11 Sep. 2020 | 1 | Introduction. Fully-connected networks. Matrix calculus, automatic differentiation. | Synopsis |
18 Sep. 2020 | 2 | Stochastic optimization for neural networks, dropout, batch normalization. Convolutional neural networks, basic architectures. | Presentation |
25 Sep. 2020 | 3 | PyTorch and implementation of convolutional neural networks. | ipynb 1, ipynb 2 |
02 Oct. 2020 | 4 | Semantic image segmentation | Presentation (pdf), Portrait Demo (source) |
09 Oct. 2020 | 5 | Object detection | Presentation (pdf), DS Bowl 2018 (pdf) |
16 Oct. 2020 | 6 | Neural style transfer. | Presentation |
23 Oct. 2020 | 7 | Recurrent neural networks. | Presentation |