Please use this identifier to cite or link to this item:
https://open.uns.ac.rs/handle/123456789/17940
Title: Distributed Gradient Methods with Variable Number of Working Nodes
Authors: Bajović Dragana; Jakovetić Dušan; Krejić Nataša; Krklec Jerinkić Nataša
Issue date: 2016
Journal: IEEE Transactions on Signal Processing
Abstract: We consider distributed optimization where N nodes in a connected network minimize the sum of their local costs subject to a common constraint set. We propose a distributed projected gradient method where each node, at each iteration k, performs an update (is active) with probability pk, and stays idle (is inactive) with probability 1-pk. Whenever active, each node performs an update by weight-averaging its solution estimate with the estimates of its active neighbors, taking a negative gradient step with respect to its local cost, and performing a projection onto the constraint set; inactive nodes perform no updates. Assuming that nodes' local costs are strongly convex, with Lipschitz continuous gradients, we show that, as long as the activation probability pk grows to one asymptotically, our algorithm converges in the mean square sense (MSS) to the same solution as the standard distributed gradient method, i.e., as if all the nodes were active at all iterations. Moreover, when pk grows to one linearly, with an appropriately set convergence factor, the algorithm has linear MSS convergence, with practically the same factor as the standard distributed gradient method. Simulations on both synthetic and real-world data sets demonstrate that, compared with the standard distributed gradient method, the proposed algorithm significantly reduces the overall number of per-node communications and per-node gradient evaluations (computational cost) for the same required accuracy.
URI: https://open.uns.ac.rs/handle/123456789/17940
ISSN: 1053-587X
DOI: 10.1109/TSP.2016.2560133
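The abstract's algorithm can be illustrated with a minimal simulation sketch. This is not the paper's implementation: the problem instance (scalar quadratic local costs on a ring of 6 nodes), the box constraint set, the equal averaging weights among active nodes, and the schedule p_k = 1 - c*r^k are all illustrative assumptions chosen so that p_k grows to one linearly, as the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical instance: N nodes on a ring; local costs f_i(x) = 0.5*a_i*(x - b_i)^2
# are strongly convex with Lipschitz gradients, matching the abstract's assumptions.
N = 6
a = rng.uniform(1.0, 2.0, N)   # strong-convexity moduli (assumed)
b = rng.uniform(-1.0, 1.0, N)  # local minimizers (assumed)
neighbors = [((i - 1) % N, (i + 1) % N) for i in range(N)]

def project(x, lo=-2.0, hi=2.0):
    """Projection onto the common constraint set (a box, for illustration)."""
    return np.clip(x, lo, hi)

def run(num_iters=500, alpha=0.02, c=0.5, r=0.9):
    x = np.zeros(N)  # per-node solution estimates
    for k in range(num_iters):
        p_k = 1.0 - c * r**k          # activation probability grows to one linearly
        active = rng.random(N) < p_k  # each node is active independently w.p. p_k
        x_new = x.copy()
        for i in range(N):
            if not active[i]:
                continue  # inactive nodes perform no updates
            # Weight-average with currently active neighbors
            # (equal weights here -- an assumption, not the paper's weight rule).
            act_nb = [j for j in neighbors[i] if active[j]]
            w = 1.0 / (len(act_nb) + 1)
            avg = w * x[i] + sum(w * x[j] for j in act_nb)
            grad = a[i] * (x[i] - b[i])  # gradient of the local cost
            x_new[i] = project(avg - alpha * grad)
        x = x_new
    return x

x_final = run()
x_star = np.sum(a * b) / np.sum(a)  # minimizer of the sum (lies inside the box)
print(np.max(np.abs(x_final - x_star)))
```

As with the standard distributed gradient method with a constant step size, the estimates converge only to a neighborhood of the optimum whose radius scales with the step size alpha, so the printed error is small but not zero.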
Appears in collections: FTN Publikacije/Publications
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.