User:Beznosikov.an


== Aleksandr Beznosikov ==

MIPT, Department of Control and Applied Mathematics (FUPM), group 674

email: beznosikov.an@phystech.edu

'''Department:''' "Intelligent Systems"

'''Field:''' "Data Mining"
== Research Reports ==

=== Fall 2019, 7th Semester ===
'''Derivative-Free Method For Decentralized Distributed Non-Smooth Optimization'''
''In this paper, we propose a new derivative-free method based on the Sliding Algorithm of Lan (2016, 2019) for the convex composite optimization problem that includes two terms: a smooth one and a non-smooth one. We prove a convergence rate for the new method that matches the corresponding rate for the first-order method up to a factor proportional to the dimension of the space. We apply this method to decentralized distributed optimization and prove bounds on the number of communication rounds that match the lower bounds. We also prove a bound on the number of zeroth-order oracle calls per node that matches the analogous state-of-the-art bound for first-order decentralized distributed optimization up to a factor proportional to the dimension of the space.''
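For intuition only, here is a minimal sketch (in Python, not taken from the paper) of the standard two-point zeroth-order gradient estimator on which derivative-free methods of this kind are built; the function name <tt>zo_gradient</tt> and the smoothing step <tt>tau</tt> are illustrative assumptions, and the exact estimator and smoothing scheme used in the paper may differ.

<source lang="python">
import numpy as np

def zo_gradient(f, x, tau=1e-4, rng=None):
    """Two-point zeroth-order estimate of the gradient of f at x.

    Draws a direction e uniformly from the unit sphere and returns
    d * (f(x + tau*e) - f(x - tau*e)) / (2*tau) * e, which in
    expectation approximates the gradient of a smoothed version of f
    using only function-value (zeroth-order) oracle calls.
    """
    rng = rng or np.random.default_rng()
    d = x.size
    e = rng.standard_normal(d)
    e /= np.linalg.norm(e)  # uniform direction on the unit sphere
    fd = (f(x + tau * e) - f(x - tau * e)) / (2.0 * tau)
    return d * fd * e

# Toy check: for f(x) = 0.5 * ||x||^2 the exact gradient is x, and
# averaging many independent estimates should approach it.
f = lambda x: 0.5 * float(x @ x)
x = np.ones(5)
est = np.mean([zo_gradient(f, x) for _ in range(5000)], axis=0)
print(est, "vs exact", x)
</source>

The factor <tt>d</tt> (the dimension) in the estimator is the source of the dimension-proportional overhead mentioned in the abstract: each estimate is an unbiased directional sample scaled up by <tt>d</tt>, so its variance grows with the dimension of the space.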
'''Publication'''
*{{биб.статья
|автор = Aleksandr Beznosikov, Eduard Gorbunov, Alexander Gasnikov
|заглавие = Derivative-Free Method For Decentralized Distributed Non-Smooth Optimization
|издание = arXiv preprint
|год = 2019
|url = https://arxiv.org/abs/1911.10645
}}
