User:Beznosikov.an
Aleksandr Beznosikov
MIPT, Faculty of Control and Applied Mathematics, group 674
email: beznosikov.an@phystech.edu
Department: "Intelligent Systems"
Field of study: "Data Mining"
Research Work Reports
Fall 2019, 7th semester
Derivative-Free Method For Decentralized Distributed Non-Smooth Optimization
In this paper, we propose a new derivative-free method based on the Sliding Algorithm of Lan (2016, 2019) for the convex composite optimization problem with two terms: a smooth one and a non-smooth one. We prove a convergence rate for the new method that matches the corresponding rate of the first-order method up to a factor proportional to the dimension of the space. We apply this method to decentralized distributed optimization and prove bounds on the number of communication rounds that match the lower bounds. We also prove a bound on the number of zeroth-order oracle calls per node that matches the analogous state-of-the-art bound for first-order decentralized distributed optimization up to a factor proportional to the dimension of the space.
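A zeroth-order oracle returns only function values, so a derivative-free method must build gradient estimates from them. Below is a minimal sketch, assuming the standard two-point random-direction estimator commonly used in this literature; it is not the paper's Sliding-based algorithm, and the function f, the smoothing parameter tau, and the toy quadratic objective are illustrative assumptions.

```python
import numpy as np

def zeroth_order_grad(f, x, tau=1e-6, rng=None):
    """Two-point random-direction gradient estimator.

    Uses two function evaluations along a direction e drawn uniformly
    from the unit sphere; the estimator is unbiased for the smoothed
    objective and approaches grad f(x) as tau -> 0.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = x.size
    e = rng.standard_normal(d)
    e /= np.linalg.norm(e)  # uniform direction on the unit sphere
    directional = (f(x + tau * e) - f(x - tau * e)) / (2 * tau)
    return d * directional * e  # E[d * (e^T grad f) e] = grad f

# Illustrative use: plug the estimator into plain gradient descent
# on a toy smooth objective (not the decentralized setting).
if __name__ == "__main__":
    f = lambda x: 0.5 * np.dot(x, x)
    x = np.ones(10)
    for _ in range(2000):
        x -= 0.05 * zeroth_order_grad(f, x)
    print(np.linalg.norm(x))  # should be close to 0
```

The dimension factor d in the estimator is what produces the extra dimension-proportional factor in the oracle-complexity bounds quoted in the abstract.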
Publication
- Aleksandr Beznosikov, Eduard Gorbunov, Alexander Gasnikov. Derivative-Free Method For Decentralized Distributed Non-Smooth Optimization // arXiv preprint, 2019.