Please use this identifier to cite or link to this item: https://open.uns.ac.rs/handle/123456789/18249
DC Field                         Value
dc.contributor.author            Jakovetić, Dušan
dc.contributor.author            Xavier, João
dc.contributor.author            Moura, José
dc.date.accessioned              2020-12-13T12:34:59Z
dc.date.available                2020-12-13T12:34:59Z
dc.date.issued                   2014
dc.identifier.issn               1053-587X
dc.identifier.uri                https://open.uns.ac.rs/handle/123456789/18249
dc.description.abstract          We consider distributed optimization in random networks where N nodes cooperatively minimize the sum \sum_{i=1}^{N} f_{i}(x) of their individual convex costs. Existing literature proposes distributed gradient-like methods that are computationally cheap and resilient to link failures, but have slow convergence rates. In this paper, we propose accelerated distributed gradient methods that 1) are resilient to link failures; 2) are computationally cheap; and 3) improve convergence rates over other gradient methods. We model the network by a sequence of independent, identically distributed random matrices \{W(k)\} drawn from the set of symmetric, stochastic matrices with positive diagonals. The network is connected on average, and the cost functions are convex and differentiable, with Lipschitz continuous and bounded gradients. We design two distributed Nesterov-like gradient methods that modify the D-NG and D-NC methods we proposed for static networks. We prove their convergence rates in terms of the expected optimality gap at the cost function. Let k and {\cal K} be the number of per-node gradient evaluations and per-node communications, respectively. Then the modified D-NG achieves rates O(\log k/k) and O(\log {\cal K}/{\cal K}), and the modified D-NC achieves rates O(1/k^{2}) and O(1/{\cal K}^{2-\xi}), where \xi > 0 is arbitrarily small. For comparison, the standard distributed gradient method cannot do better than \Omega(1/k^{2/3}) and \Omega(1/{\cal K}^{2/3}) on the same class of cost functions (even for static networks). Simulation examples illustrate our analytical findings. © 1991-2012 IEEE.
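The Nesterov-like distributed iteration the abstract refers to can be sketched as a small simulation. This is a minimal, illustrative sketch only, not the authors' implementation: it assumes scalar quadratic costs f_i(x) = ½(x − a_i)², a diminishing step size c/(k+1), momentum weight k/(k+3), and an i.i.d. draw each round from two symmetric stochastic weight matrices with positive diagonals (a ring, and the identity to model a round where all links fail). All constants, matrices, and the cost functions are assumptions chosen for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 4
a = np.array([1.0, 2.0, 3.0, 4.0])  # f_i(x) = 0.5*(x - a_i)^2; global optimum is mean(a) = 2.5

def grad(y):
    # Per-node gradients of the quadratic costs, evaluated at each node's own iterate
    return y - a

# Two symmetric, stochastic weight matrices with positive diagonals;
# drawing uniformly between them gives an i.i.d. sequence {W(k)} that is
# connected on average, as in the abstract's network model.
ring = np.array([[0.50, 0.25, 0.00, 0.25],
                 [0.25, 0.50, 0.25, 0.00],
                 [0.00, 0.25, 0.50, 0.25],
                 [0.25, 0.00, 0.25, 0.50]])
Ws = [ring, np.eye(N)]  # identity = all links failed this round

x = np.zeros(N)  # per-node iterates
y = np.zeros(N)  # per-node "momentum" iterates
c = 1.0          # step-size constant (illustrative choice)
for k in range(5000):
    W = Ws[rng.integers(len(Ws))]
    alpha = c / (k + 1)                       # diminishing step size
    x_new = W @ y - alpha * grad(y)           # consensus step + local gradient step
    y = x_new + (k / (k + 3)) * (x_new - x)   # Nesterov-style momentum
    x = x_new

print(x)  # all entries should be close to the global optimum 2.5
```

Running this, the nodes both reach consensus and approach the global minimizer of the sum, despite half the rounds having no communication at all, which is the qualitative behavior the convergence rates above describe.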
dc.language.iso                  en
dc.relation.ispartof             IEEE Transactions on Signal Processing
dc.source                        CRIS UNS
dc.source.uri                    http://cris.uns.ac.rs
dc.title                         Convergence rates of distributed Nesterov-like gradient methods on random networks
dc.type                          Journal/Magazine Article
dc.identifier.doi                10.1109/TSP.2013.2291221
dc.identifier.scopus             2-s2.0-84893478851
dc.identifier.url                https://www.cris.uns.ac.rs/record.jsf?recordId=106068&source=BEOPEN&language=en
dc.identifier.url                https://api.elsevier.com/content/abstract/scopus_id/84893478851
dc.relation.lastpage             882
dc.relation.firstpage            868
dc.relation.issue                4
dc.relation.volume               62
dc.identifier.externalcrisreference  (BISIS)106068
item.grantfulltext               none
item.fulltext                    No Fulltext
crisitem.author.dept             Prirodno-matematički fakultet, Departman za matematiku i informatiku
crisitem.author.parentorg        Prirodno-matematički fakultet
Appears in Collections:PMF Publikacije/Publications
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.