Deep Learning (lecture course)/2020

Material from MachineLearning.


Version of 14:20, 18 September 2020

This is an introductory course on deep learning models and their application for solving different applied problems of image and text analysis.

Instructors: Dmitry Kropotov, Victor Kitov, Nadezhda Chirkova, Oleg Ivanov and Evgeny Nizhibitsky.

Timetable in Autumn 2020: Fridays, lectures begin at 10:30, seminars begin at 12:15, zoom-link

Lectures and seminars video recordings: link

Anytask invite code: ldQ0L2R

Course chat in Telegram: link

Rules and grades

TBA

Lectures and seminars

Date         | No. | Topic                                                                      | Materials
11 Sep. 2020 | 1   | Introduction. Fully-connected networks.                                    |
             |     | Matrix calculus, automatic differentiation.                                | Synopsis: https://drive.google.com/file/d/1Yu790uIPyxp9JIyysxfJDor_LJQu83gQ/view?usp=sharing
18 Sep. 2020 | 2   | Stochastic optimization for neural networks, dropout, batch normalization. |
             |     | Convolutional neural networks, basic architectures.                        | Presentation: https://drive.google.com/file/d/1uSVdPsn5wznk510gS9N1K9DXITpxNFXt/view?usp=sharing
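One of the topics above, automatic differentiation, can be illustrated with a minimal scalar reverse-mode sketch. This is our own illustrative code, not material from the course synopsis: the `Var` class, its operators, and the `backward` method are hypothetical names chosen here. Each `Var` records its parents together with local partial derivatives; `backward` traverses the computation graph in reverse topological order and accumulates gradients by the chain rule.

```python
# Minimal reverse-mode automatic differentiation over scalars.
# Illustrative sketch only; names are our own, not from the course materials.

class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        # parents: tuples of (parent Var, local partial derivative d(self)/d(parent))
        self.parents = parents

    def __add__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value + other.value, ((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value * other.value,
                   ((self, other.value), (other, self.value)))

    def backward(self):
        # Build a topological order so each node's gradient is complete
        # before it is propagated to its parents.
        topo, visited = [], set()

        def build(v):
            if id(v) not in visited:
                visited.add(id(v))
                for parent, _ in v.parents:
                    build(parent)
                topo.append(v)

        build(self)
        self.grad = 1.0
        for node in reversed(topo):
            for parent, local in node.parents:
                # Chain rule: accumulate (upstream grad) * (local derivative).
                parent.grad += local * node.grad


x = Var(3.0)
y = Var(4.0)
f = x * y + x      # f = x*y + x, so df/dx = y + 1 = 5, df/dy = x = 3
f.backward()
print(x.grad, y.grad)
```

The topological sort matters when a variable is reused in several subexpressions (as `x` is here): a naive stack traversal could propagate stale, partially accumulated gradients, whereas processing nodes in reverse topological order guarantees each gradient is final before it flows to the node's parents.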


Archive

2019

2017

2016
