Участник:Strijov/Drafts
2021
Bayesian model selection and multimodeling
Course page: https://github.com/Intelligent-Systems-Phystech/BMM-21
The lecture course addresses the main problem of machine learning, the problem of model selection. One can set a heuristic model and optimise its parameters, select a model from a class, make a teacher model transfer its knowledge to a student model, or even build an ensemble of models. Behind all these strategies lies one fundamental technique: Bayesian inference. It states hypotheses about the measured data set, about the model parameters, and even about the model structure, and from these hypotheses it derives the error function to optimise. This leads to the Minimum Description Length principle, which selects simple, stable and precise models. The course joins the theory with practical lab works on model selection and multimodeling.
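To make the statement above concrete, here is a minimal sketch in my own notation (not taken from the course materials): let D be the observed data set, w the model parameters, and f the model structure. Bayesian inference turns the assumed likelihood and priors into the error function and into a criterion for comparing models:

```latex
% Illustrative notation: D -- data, w -- parameters, f -- model structure.
\begin{aligned}
p(w \mid D, f) &= \frac{p(D \mid w, f)\, p(w \mid f)}{p(D \mid f)}
  && \text{posterior over parameters},\\
E(w) &= -\log p(D \mid w, f) - \log p(w \mid f)
  && \text{induced error function (data term + regulariser)},\\
p(D \mid f) &= \int p(D \mid w, f)\, p(w \mid f)\, \mathrm{d}w
  && \text{evidence of the model},\\
L(f) &= -\log p(D \mid f) - \log p(f)
  && \text{description length: the MDL criterion for selecting } f.
\end{aligned}
```

Minimising L(f) over a class of models favours simple, stable and precise models, which is the selection principle referred to above.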
Grading
- Labs: 6 in total
- Forms: 1 in total
- Reports: 2 in total
The maximum attainable score is 11; the final score is MIN(10, score).
Syllabus
- 8.09 Intro
- 15.09 Distributions, expectation, likelihood
- 22.09 Bayesian inference
- 29.09 MDL, Minimum description length principle
- 6.10 Probabilistic metric spaces
- 13.10 Generative and discriminative models
- 20.10 Data generation, VAE, GAN
- 27.10 Probabilistic graphical models
- 3.11 Variational inference
- 10.11 Variational inference 2
- 17.11 Hyperparameter optimization
- 24.11 Meta-optimization
- 1.12 Bayesian PCA, GLM and NN
- 8.12 Gaussian processes
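The syllabus is only a list of topics; as an informal illustration of the kind of computation several of them lead to (Bayesian inference, evidence, MDL-style model selection), here is a small lab-style sketch. Everything in it, the synthetic data, the prior and noise precisions alpha and beta, and the function names, is my own assumption and not part of the official labs.

```python
# Illustrative sketch: Bayesian model selection for polynomial regression
# by comparing the log-evidence of models of different degree.
# Assumptions (mine): zero-mean Gaussian prior on the weights with precision
# alpha, Gaussian observation noise with precision beta, both fixed.
import numpy as np

def design_matrix(x, degree):
    """Polynomial features 1, x, ..., x^degree."""
    return np.vander(x, degree + 1, increasing=True)

def log_evidence(x, y, degree, alpha=1.0, beta=25.0):
    """log p(y | X, model) for Bayesian linear regression.

    With prior w ~ N(0, alpha^{-1} I) and noise N(0, beta^{-1}), the marginal
    likelihood is y ~ N(0, C) with C = beta^{-1} I + alpha^{-1} Phi Phi^T.
    """
    Phi = design_matrix(x, degree)
    n = len(y)
    C = np.eye(n) / beta + Phi @ Phi.T / alpha
    _, logdet = np.linalg.slogdet(C)
    return -0.5 * (n * np.log(2 * np.pi) + logdet + y @ np.linalg.solve(C, y))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.linspace(-1.0, 1.0, 30)
    y = np.sin(np.pi * x) + rng.normal(scale=0.2, size=x.size)  # smooth signal + noise

    # The evidence penalises both underfitting and excessive complexity,
    # so an intermediate degree should win (the MDL effect from the lectures).
    for degree in range(10):
        print(f"degree {degree}: log-evidence = {log_evidence(x, y, degree):.2f}")
```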
References
Books
- Bishop
- Barber
- Murphy
- Rasmussen and Williams, of course!
- Taboga (to catch up)
Theses
- Грабовой А.В. PhD thesis.
- Бахтеев О.Ю. Выбор моделей глубокого обучения субоптимальной сложности. 2020, MIPT. Thesis: http://www.frccsc.ru/sites/default/files/docs/ds/002-073-05/diss/26-bahteev/ds05-26-bahteev_main.pdf?28, git: https://github.com/bahleg/tex_phd/raw/master/doc/BakhteevThesis.pdf, extended abstract: https://github.com/bahleg/tex_phd/raw/master/doc/BakhteevAuto.pdf, slides (PDF): https://github.com/bahleg/tex_slides/raw/master/predef_19/BakhteevSlidesShort.pdf, video: https://www.youtube.com/watch?v=JpLCf15p6jk
- Адуенко А.А. Выбор мультимоделей в задачах классификации. 2017, MIPT. Thesis: https://sourceforge.net/p/mlalgorithms/code/HEAD/tree/PhDThesis/Aduenko2017Multimodels/AduenkoThesis.pdf, slides (PDF, wiki file): Media:AduenkoThesisPresentation_20170219.pdf, video: https://www.youtube.com/watch?v=HPm8yrc4EtE
- Кузьмин А.А. Построение иерархических тематических моделей коллекций коротких текстов. 2017, MIPT. Thesis: https://sourceforge.net/p/mlalgorithms/code/HEAD/tree/PhDThesis/Kuzmin/main.pdf, slides (PDF): https://sourceforge.net/p/mlalgorithms/code/HEAD/tree/PhDThesis/Kuzmin/Kuzmin2017Presentation.pdf, video: https://www.youtube.com/watch?v=65Qx_NyppNo
Papers
- Kuznetsov M.P., Tokmakova A.A., Strijov V.V. Analytic and stochastic methods of structure parameter estimation // Informatica, 2016, 27(3) : 607-624. PDF: http://strijov.com/papers/HyperOptimizationEng.pdf
- Bakhteev O.Y., Strijov V.V. Deep learning model selection of suboptimal complexity // Automation and Remote Control, 2018, 79(8) : 1474–1488. PDF: https://link.springer.com/content/pdf/10.1134%2FS000511791808009X.pdf
- Bakhteev O.Y., Strijov V.V. Comprehensive analysis of gradient-based hyperparameter optimization algorithms // Annals of Operations Research, 2020 : 1-15. PDF: http://strijov.com/papers/Bakhteev2017Hypergrad.pdf