Please use this identifier to cite or link to this item: https://open.uns.ac.rs/handle/123456789/17940
DC Field: Value

dc.contributor.author: Bajović, Dragana
dc.contributor.author: Jakovetić, Dušan
dc.contributor.author: Krejić, Nataša
dc.contributor.author: Krklec Jerinkić, Nataša
dc.date.accessioned: 2020-12-13T12:15:59Z
dc.date.available: 2020-12-13T12:15:59Z
dc.date.issued: 2016
dc.identifier.issn: 1053-587X
dc.identifier.uri: https://open.uns.ac.rs/handle/123456789/17940
dc.description.abstract: © 1991-2012 IEEE. We consider distributed optimization where N nodes in a connected network minimize the sum of their local costs subject to a common constraint set. We propose a distributed projected gradient method where each node, at each iteration k, performs an update (is active) with probability pk, and stays idle (is inactive) with probability 1-pk. Whenever active, each node performs an update by weight-averaging its solution estimate with the estimates of its active neighbors, taking a negative gradient step with respect to its local cost, and performing a projection onto the constraint set; inactive nodes perform no updates. Assuming that nodes' local costs are strongly convex, with Lipschitz continuous gradients, we show that, as long as activation probability pk grows to one asymptotically, our algorithm converges in the mean square sense (MSS) to the same solution as the standard distributed gradient method, i.e., as if all the nodes were active at all iterations. Moreover, when pk grows to one linearly, with an appropriately set convergence factor, the algorithm has a linear MSS convergence, with practically the same factor as the standard distributed gradient method. Simulations on both synthetic and real world data sets demonstrate that, when compared with the standard distributed gradient method, the proposed algorithm significantly reduces the overall number of per-node communications and per-node gradient evaluations (computational cost) for the same required accuracy.
dc.language.iso: en
dc.relation.ispartof: IEEE Transactions on Signal Processing
dc.source: CRIS UNS
dc.source.uri: http://cris.uns.ac.rs
dc.title: Distributed Gradient Methods with Variable Number of Working Nodes
dc.type: Journal/Magazine Article
dc.identifier.doi: 10.1109/TSP.2016.2560133
dc.identifier.scopus: 2-s2.0-84978974861
dc.identifier.url: https://www.cris.uns.ac.rs/record.jsf?recordId=105382&source=BEOPEN&language=en
dc.identifier.url: https://api.elsevier.com/content/abstract/scopus_id/84978974861
dc.relation.firstpage: 4080
dc.relation.lastpage: 4095
dc.relation.issue: 15
dc.relation.volume: 64
dc.identifier.externalcrisreference: (BISIS)105382
item.grantfulltext: none
item.fulltext: No Fulltext
crisitem.author.dept: Fakultet tehničkih nauka, Departman za energetiku, elektroniku i telekomunikacije
crisitem.author.dept: Prirodno-matematički fakultet, Departman za matematiku i informatiku
crisitem.author.dept: Prirodno-matematički fakultet, Departman za matematiku i informatiku
crisitem.author.dept: Prirodno-matematički fakultet, Departman za matematiku i informatiku
crisitem.author.orcid: 0000-0003-3348-7233
crisitem.author.orcid: https://orcid.org/0000-0001-5195-9295
crisitem.author.orcid: 0000-0001-5195-9295
crisitem.author.parentorg: Fakultet tehničkih nauka
crisitem.author.parentorg: Prirodno-matematički fakultet
crisitem.author.parentorg: Prirodno-matematički fakultet
crisitem.author.parentorg: Prirodno-matematički fakultet
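The abstract above outlines the iteration: at step k each node is active with probability pk, and an active node weight-averages its estimate with those of its active neighbors, takes a negative gradient step on its local cost, and projects onto the common constraint set. The following is a minimal illustrative sketch of that scheme; the ring topology, quadratic local costs, box constraint, equal averaging weights, constant step size, and the schedule pk = 1 - 0.5^(k+1) are all assumptions made here for demonstration, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 5   # number of nodes
d = 2   # dimension of the decision variable

# Assumed local cost f_i(x) = 0.5 * ||A_i x - b_i||^2 (strongly convex
# when A_i has full column rank), purely for illustration.
A = [rng.standard_normal((d, d)) + 2 * np.eye(d) for _ in range(N)]
b = [rng.standard_normal(d) for _ in range(N)]

def grad(i, x):
    """Gradient of node i's local cost at x."""
    return A[i].T @ (A[i] @ x - b[i])

def project(x, lo=-1.0, hi=1.0):
    """Projection onto the assumed common constraint set, the box [lo, hi]^d."""
    return np.clip(x, lo, hi)

# Assumed connected topology: a ring, where node i talks to i-1 and i+1.
neighbors = {i: [(i - 1) % N, (i + 1) % N] for i in range(N)}

x = [np.zeros(d) for _ in range(N)]   # per-node solution estimates
alpha = 0.05                          # assumed constant step size

for k in range(200):
    p_k = 1.0 - 0.5 ** (k + 1)        # activation probability growing to one
    active = [rng.random() < p_k for _ in range(N)]
    x_new = [xi.copy() for xi in x]
    for i in range(N):
        if not active[i]:
            continue                  # inactive nodes perform no update
        act_nbrs = [j for j in neighbors[i] if active[j]]
        # Weight-average with active neighbors (equal weights, an assumption).
        w = 1.0 / (len(act_nbrs) + 1)
        avg = w * x[i] + sum(w * x[j] for j in act_nbrs)
        # Negative gradient step on the local cost, then projection.
        x_new[i] = project(avg - alpha * grad(i, x[i]))
    x = x_new
```

Because pk tends to one, later iterations behave like the standard distributed gradient method, so the estimates drift toward consensus on a point of the constraint set, while the early low-probability iterations save communications and gradient evaluations.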
Appears in Collections: FTN Publikacije/Publications
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.