Mathematical Methods of Forecasting (lectures, A.V. Grabovoy, V.V. Strizhov) / Fall 2021


Mathematical methods of forecasting

The lecture course and seminar introduce and apply methods of modern physics to problems of machine learning.

Minimum topics to discuss: the geometric deep learning approach.

Optimum topics to discuss: tensors, differential forms, Riemannian and differential geometry, metrics, differential operators in various spaces, embeddings, manifolds, bundles. We investigate scalar, vector and tensor fields (as well as jets, fibers and sheaves, tensor bundles, sheaf bundles, etc.). The fields and spaces are one-dimensional, multidimensional and continuously dimensional.


Grading

  • Questionnaires during lectures (4)
  • Two application projects (2+2)
  • The final exam: problems with discussion (1)
  • Bonus for active participation (2)

Deadlines

  • October 14th (a preliminary show one week earlier, on October 7th, is strongly advised)
  • December 2nd (a preliminary show one week earlier, on November 25th, is strongly advised)

Lab work report and talk

  1. Title and motivated abstract
  2. Problem statement
  3. Problem solution
  4. Link to the code
  5. Analysis and illustrative plots
  6. References

The report template is here. Please follow the instructions in the template.

Themes

BCI, Matrix and tensor approximation

  1. Korenev, G.V. Tensor Calculus, 2000, 240 pp. (in Russian), lib.mipt.ru.
  2. Roger Penrose, "Applications of negative dimensional tensors," in Combinatorial Mathematics and its Applications, Academic Press (1971). See Vladimir Turaev, Quantum invariants of knots and 3-manifolds (1994), De Gruyter, p. 71 for a brief commentary, PDF.
  3. Tai-Danae Bradley, At the Interface of Algebra and Statistics, 2020, ArXiv.
  4. Oseledets, I.V. Tensor-Train Decomposition // SIAM Journal on Scientific Computing, 2011, 33(5): 2295–2317, DOI, RG, lecture, GitHub, Tutorial.
  5. Wikipedia: SVD, Multilinear subspace learning, HOSVD (a minimal TT-SVD sketch follows this list).
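
The Oseledets reference above reduces tensor approximation to a sequence of truncated SVDs applied to reshaped unfoldings (the TT-SVD scheme). Below is a minimal NumPy sketch of that idea, for orientation only; the relative singular-value threshold eps and the helper names are choices made for this example, not part of the cited papers.

    import numpy as np

    def tt_svd(tensor, eps=1e-10):
        """Decompose a d-way array into tensor-train cores by sequential truncated SVD."""
        shape = tensor.shape
        cores, rank = [], 1
        unfolding = tensor.reshape(rank * shape[0], -1)
        for k in range(len(shape) - 1):
            U, s, Vt = np.linalg.svd(unfolding, full_matrices=False)
            new_rank = max(1, int(np.sum(s > eps * s[0])))   # simple relative threshold
            cores.append(U[:, :new_rank].reshape(rank, shape[k], new_rank))
            unfolding = (np.diag(s[:new_rank]) @ Vt[:new_rank]).reshape(
                new_rank * shape[k + 1], -1)
            rank = new_rank
        cores.append(unfolding.reshape(rank, shape[-1], 1))
        return cores

    def tt_reconstruct(cores):
        """Contract the cores back into the full array, e.g. to check the error."""
        result = cores[0]
        for core in cores[1:]:
            result = np.tensordot(result, core, axes=([-1], [0]))
        return result.squeeze(axis=(0, -1))

    X = np.random.rand(4, 5, 6)
    print(np.linalg.norm(tt_reconstruct(tt_svd(X, eps=1e-12)) - X))  # close to machine precision

With eps near machine precision the reconstruction is exact; raising eps trades accuracy for smaller TT ranks, which is the compression studied in the reference.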

BCI, Feature selection

  1. Motrenko A.P. Selection of Forecasting Models for Multicorrelated Time Series (PhD thesis, in Russian), 2019, PDF
  2. Isachenko R.V. Dimensionality Reduction in Signal Decoding Problems (PhD thesis, in Russian), 2021, PDF (a generic feature-selection sketch follows this list)
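
Both theses study how to choose informative features when predictors and targets are strongly correlated. As a purely generic illustration of the relevance-versus-redundancy trade-off (not the methods proposed in the theses), a greedy correlation filter might look as follows; greedy_select and its scoring rule are ad hoc choices for this example.

    import numpy as np

    def greedy_select(X, y, k):
        """Greedily pick k columns: high correlation with y, low correlation with those already chosen."""
        n_features = X.shape[1]
        relevance = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(n_features)])
        redundancy = np.abs(np.corrcoef(X, rowvar=False))
        selected = [int(np.argmax(relevance))]
        while len(selected) < k:
            rest = [j for j in range(n_features) if j not in selected]
            scores = [relevance[j] - redundancy[j, selected].mean() for j in rest]
            selected.append(rest[int(np.argmax(scores))])
        return selected

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))
    y = X[:, 2] - 0.5 * X[:, 7] + 0.1 * rng.normal(size=200)
    print(greedy_select(X, y, k=3))   # indices 2 and 7 should appear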

Higher-order partial least squares

  1. Qibin Zhao et al. (with A. Cichocki), Higher Order Partial Least Squares (HOPLS): A Generalized Multilinear Regression Method // IEEE Transactions on Pattern Analysis and Machine Intelligence, July 2013, vol. 35, pp. 1660–1673, DOI, ArXiv (a two-way PLS baseline sketch follows this item).
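
HOPLS generalizes ordinary partial least squares from matrices to tensors by replacing rank-one terms with Tucker blocks. For orientation only, here is a minimal NIPALS sketch of the two-way PLS1 baseline that HOPLS extends (not HOPLS itself); centering, scaling and stopping rules are omitted, and the function name is arbitrary.

    import numpy as np

    def pls1(X, y, n_components=2):
        """Plain NIPALS PLS1: project X onto components that covary with y, then regress."""
        X, y = X.astype(float).copy(), y.astype(float).copy()
        W, P, Q = [], [], []
        for _ in range(n_components):
            w = X.T @ y
            w /= np.linalg.norm(w)          # weight vector
            t = X @ w                       # score
            tt = t @ t
            p, q = X.T @ t / tt, (y @ t) / tt
            X -= np.outer(t, p)             # deflate predictors
            y -= q * t                      # deflate response
            W.append(w); P.append(p); Q.append(q)
        W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
        return W @ np.linalg.solve(P.T @ W, Q)   # coefficients in the original space

    rng = np.random.default_rng(1)
    X = rng.normal(size=(100, 6))
    y = X @ np.array([1.0, 0.0, -2.0, 0.0, 0.5, 0.0]) + 0.01 * rng.normal(size=100)
    print(np.round(pls1(X, y, n_components=3), 2))   # approximately recovers the true coefficients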

Neural ODEs and Continuous normalizing flows

  1. Ricky T. Q. Chen et al., Neural Ordinary Differential Equations // NIPS, 2018, ArXiv, source paper and code
  2. Johann Brehmer and Kyle Cranmer, Flows for simultaneous manifold learning and density estimation // NIPS, 2020, ArXiv
  3. Flows at deepgenerativemodels.github.io
  4. Flow-based deep generative models
  5. Variational Inference with Normalizing Flows (source paper, Goes to BME)
  6. An introduction to Neural ODE on Habr (in Russian), W: Flow-based generative model (a minimal Neural ODE sketch follows this list)
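
The Chen et al. reference treats the hidden state of a network as the solution of dh/dt = f_theta(h, t). The PyTorch sketch below illustrates only this core idea: a fixed-step Euler loop stands in for the adaptive adjoint solver of the paper, and the class names and layer sizes are arbitrary choices for the example.

    import torch
    import torch.nn as nn

    class ODEFunc(nn.Module):
        """Learnable vector field f_theta(h, t) defining dh/dt."""
        def __init__(self, dim, hidden=64):
            super().__init__()
            self.net = nn.Sequential(nn.Linear(dim + 1, hidden), nn.Tanh(),
                                     nn.Linear(hidden, dim))
        def forward(self, h, t):
            t_col = torch.full((h.shape[0], 1), t)     # time as an extra input feature
            return self.net(torch.cat([h, t_col], dim=1))

    class NeuralODEBlock(nn.Module):
        """Hidden state evolves as an ODE solution, integrated here by fixed-step Euler."""
        def __init__(self, func, t0=0.0, t1=1.0, steps=20):
            super().__init__()
            self.func, self.t0, self.t1, self.steps = func, t0, t1, steps
        def forward(self, h):
            dt = (self.t1 - self.t0) / self.steps
            t = self.t0
            for _ in range(self.steps):
                h = h + dt * self.func(h, t)            # Euler step
                t += dt
            return h

    block = NeuralODEBlock(ODEFunc(dim=2))
    print(block(torch.randn(5, 2)).shape)               # torch.Size([5, 2])

A continuous normalizing flow adds the change-of-variables term to the same integration, tracking the log-density along the trajectory via the divergence of f_theta.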

Continuous-time representation

  1. Alina Samokhina, Continuous Time Representation in Signal Decoding Problems (master's thesis, in Russian), 2021, PDF, GitHub
  2. Aaron R. Voelker, Ivana Kajić, Chris Eliasmith, Legendre Memory Units: Continuous-Time Representation in Recurrent Neural Networks // NIPS, 2019, PDF, PDF.
  3. Functional data analysis: splines (a Legendre-basis sketch follows this list)
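
A shared idea in this theme is to replace a discretely (and possibly irregularly) sampled signal with a fixed-size description of the underlying continuous-time function. The sketch below is a minimal functional-data-style example using a Legendre polynomial basis and least squares; it is not the spline or LMU construction from the references, and the function names are ad hoc.

    import numpy as np
    from numpy.polynomial import legendre

    def legendre_coefficients(t, x, degree=7):
        """Represent samples x(t) by least-squares coefficients in a Legendre basis."""
        tau = 2 * (t - t.min()) / (t.max() - t.min()) - 1   # map times to [-1, 1]
        basis = legendre.legvander(tau, degree)              # shape (len(t), degree + 1)
        coef, *_ = np.linalg.lstsq(basis, x, rcond=None)
        return coef

    def evaluate(coef, t, t_min, t_max):
        """Evaluate the continuous representation at arbitrary times."""
        tau = 2 * (t - t_min) / (t_max - t_min) - 1
        return legendre.legval(tau, coef)

    rng = np.random.default_rng(2)
    t = np.sort(rng.uniform(0, 10, size=60))                 # irregular sampling
    x = np.sin(t) + 0.05 * rng.normal(size=t.size)
    coef = legendre_coefficients(t, x)
    print(np.round(evaluate(coef, np.array([2.5, 7.5]), t.min(), t.max()), 2))

The coefficient vector is a fixed-size, sampling-rate-independent description of the signal, which is the kind of input the decoding models above work with.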


Questions for the first semester

  1. Forecasting challenges and statements of forecasting problems
  2. Autoregressive models (a minimal AR(p) sketch follows this list)
  3. The Caterpillar method (singular spectrum analysis), selection of components
  4. Models for forecasting time series with high error variance
  5. Accounting for error in forecasting
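
For the autoregressive-models question, a minimal least-squares AR(p) fit with a one-step forecast may serve as a worked example; the simulated AR(2) process and the helper names are arbitrary, and no model selection or error analysis is included.

    import numpy as np

    def fit_ar(x, p):
        """Least-squares AR(p) coefficients with an intercept."""
        lags = np.array([x[i - p:i][::-1] for i in range(p, len(x))])  # most recent lag first
        A = np.column_stack([np.ones(len(lags)), lags])
        coef, *_ = np.linalg.lstsq(A, x[p:], rcond=None)
        return coef

    def forecast_next(x, coef):
        """One-step-ahead forecast from the last p observations."""
        p = len(coef) - 1
        return coef[0] + coef[1:] @ x[-p:][::-1]

    rng = np.random.default_rng(3)
    x = np.zeros(300)
    for i in range(2, 300):                     # simulate an AR(2) process
        x[i] = 0.6 * x[i - 1] - 0.3 * x[i - 2] + rng.normal()
    coef = fit_ar(x, p=2)
    print(np.round(coef, 2), round(forecast_next(x, coef), 2))   # coef roughly [0, 0.6, -0.3]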