Please use this identifier to cite or link to this item: https://open.uns.ac.rs/handle/123456789/18258
Title: Fast distributed gradient methods
Authors: Dušan Jakovetić
João Xavier
José M. F. Moura
Issue Date: 2014
Journal: IEEE Transactions on Automatic Control
Abstract: We study distributed optimization problems in which N nodes minimize the sum of their individual costs subject to a common vector variable. The costs are convex and have Lipschitz continuous gradient (with constant L) and bounded gradient. We propose two fast distributed gradient algorithms based on the centralized Nesterov gradient algorithm and establish their convergence rates in terms of the number of per-node communications K and per-node gradient evaluations k. Our first method, Distributed Nesterov Gradient, achieves rates O(log K / K) and O(log k / k). Our second method, Distributed Nesterov gradient with Consensus iterations, assumes that all nodes know L and μ(W), the second largest singular value of the N × N doubly stochastic weight matrix W. It achieves rates O(1/K^{2−ξ}) and O(1/k^2), where ξ > 0 is arbitrarily small. Further, for both methods we give the explicit dependence of the convergence constants on N and W. Simulation examples illustrate our findings. © 2014 IEEE.
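For intuition, the following is a minimal, self-contained sketch of a Nesterov-style distributed gradient iteration in the spirit of the first method described in the abstract: each node mixes its neighbors' estimates through the doubly stochastic weight matrix W, takes a local gradient step with a diminishing step size, and then applies a momentum extrapolation. The quadratic per-node costs f_i, the ring topology, and the step-size constant c are illustrative assumptions chosen for the demo, not values taken from the paper.

```python
import numpy as np

# Sketch of a distributed Nesterov-style gradient iteration:
# consensus-weighted averaging + diminishing-step gradient + momentum.
# The per-node costs f_i(x) = 0.5 * (x - b_i)^2, the ring weight matrix W,
# and the constant c are assumptions for illustration only.

N = 5                       # number of nodes
rng = np.random.default_rng(0)
b = rng.normal(size=N)      # each node i privately holds b_i

def grad(i, x):
    """Gradient of the hypothetical local cost f_i(x) = 0.5 * (x - b_i)^2."""
    return x - b[i]

# Doubly stochastic weights on a ring: each node averages with its two neighbors.
W = np.zeros((N, N))
for i in range(N):
    W[i, i] = 0.5
    W[i, (i - 1) % N] = 0.25
    W[i, (i + 1) % N] = 0.25

c = 0.5                     # step-size constant (assumed); alpha_k = c / (k + 1)
x = np.zeros(N)             # node estimates x_i(k)
y = x.copy()                # auxiliary (momentum) variables y_i(k)

for k in range(200):
    alpha = c / (k + 1)
    # Consensus-weighted average of neighbors' y, then a local gradient step.
    x_new = W @ y - alpha * np.array([grad(i, y[i]) for i in range(N)])
    # Nesterov momentum extrapolation.
    y = x_new + (k / (k + 3)) * (x_new - x)
    x = x_new

print("node estimates:", x)
print("optimum (mean of b):", b.mean())   # minimizer of sum_i f_i
```

Under these assumptions every node's estimate approaches the minimizer of the aggregate cost (here, the mean of the b_i); the diminishing 1/(k+1) step size is what the paper associates with the O(log k / k) rate quoted above.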
URI: https://open.uns.ac.rs/handle/123456789/18258
ISSN: 0018-9286
DOI: 10.1109/TAC.2014.2298712
Appears in Collections: PMF Publikacije/Publications
