Decentralized Optimization with Topology-Independent Communication
Y. Lin, Y. Kuang, A. Alacaoglu, M. P. Friedlander. arXiv:2509.14488,
2025.
[abs]
Distributed optimization requires nodes to coordinate, yet full synchronization scales poorly. When $n$ nodes collaborate through $m$ pairwise regularizers, standard methods demand $\mathcal{O}(m)$ communications per iteration. This paper proposes randomized local coordination: each node independently samples one regularizer uniformly and coordinates only with the nodes sharing that term. This scheme exploits partial separability, in which each regularizer $G_j$ depends only on a subset $S_j \subseteq \{1,\ldots,n\}$ of nodes. For graph-guided regularizers, where $|S_j| = 2$, expected communication drops to exactly two messages per iteration. The method requires $\tilde{\mathcal{O}}(\varepsilon^{-2})$ iterations for convex objectives; under strong convexity, it reaches an $\varepsilon$-solution in $\mathcal{O}(\varepsilon^{-1})$ iterations and a neighborhood of the solution in $\mathcal{O}(\log(1/\varepsilon))$ iterations. Replacing the proximal map of the full sum $\sum_j G_j$ with the proximal map of a single randomly selected regularizer $G_j$ preserves convergence while eliminating global coordination. Experiments validate both the convergence rates and the communication efficiency on synthetic and real-world datasets.
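The core mechanism is easy to sketch. Below is a minimal, self-contained simulation of the randomized-local-coordination idea on a toy graph-guided fused-lasso problem. The function names (prox_pair, randomized_local_coordination), the toy objective, and the scaling of the sampled proximal step by $m$ are illustrative assumptions, not the paper's exact algorithm or step-size rule; for simplicity, the independent per-node sampling is also collapsed into one centralized draw per iteration.

import numpy as np

def prox_pair(xu, xv, t):
    # Prox of t*|xu - xv| over a pair of scalars: keep the average,
    # soft-threshold the gap by 2*t.
    avg, gap = 0.5 * (xu + xv), xu - xv
    gap = np.sign(gap) * max(abs(gap) - 2.0 * t, 0.0)
    return avg + 0.5 * gap, avg - 0.5 * gap

def randomized_local_coordination(a, edges, lam=0.5, step=0.05,
                                  iters=5000, seed=0):
    # Toy objective: sum_i 0.5*(x_i - a_i)^2 + lam * sum_j |x_u - x_v|
    # over edges (u, v), so each regularizer G_j couples S_j = {u, v}.
    rng = np.random.default_rng(seed)
    x = np.zeros_like(a, dtype=float)
    m = len(edges)
    for _ in range(iters):
        x -= step * (x - a)            # local gradient steps: no communication
        u, v = edges[rng.integers(m)]  # sample ONE regularizer uniformly
        # Only nodes u and v coordinate (2 messages). Scaling the prox
        # weight by m matches the full sum in expectation (a heuristic,
        # not necessarily the paper's rule).
        x[u], x[v] = prox_pair(x[u], x[v], step * lam * m)
    return x

if __name__ == "__main__":
    a = np.array([0.0, 0.1, 2.0, 2.1])   # two natural value clusters
    edges = [(0, 1), (1, 2), (2, 3)]     # path graph, |S_j| = 2 per edge
    print(randomized_local_coordination(a, edges))

Because each iteration touches only the sampled pair, the per-iteration communication cost is two messages regardless of how large or dense the graph is, which is the topology independence the title refers to.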
[bib]
@misc{LinKuangAlacaogluFriedlander2025,
author = {Lin, Ying and Kuang, Yao and Alacaoglu, Ahmet and Friedlander, Michael P.},
title = {Decentralized Optimization with Topology-Independent Communication},
year = {2025},
eprint = {2509.14488},
archiveprefix = {arXiv},
primaryclass = {cs.LG},
doi = {10.48550/arXiv.2509.14488}
}