A dual approach for federated learning
Z. Fan, H. Fang, M. P. Friedlander. arXiv 2201.11183, 2022.
[abs] [bib] [arXiv] [GitHub] [DOI]
We study the federated optimization problem from a dual perspective and propose a new algorithm, termed federated dual coordinate descent (FedDCD), which is based on a type of coordinate descent method developed by Necoara et al. [Journal of Optimization Theory and Applications, 2017]. Additionally, we enhance the FedDCD method with inexact gradient oracles and Nesterov’s acceleration. We demonstrate theoretically that our proposed approach achieves better convergence rates than state-of-the-art primal federated optimization algorithms under mild conditions. Numerical experiments on real-world datasets support our analysis.
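The dual-coordinate idea behind this line of work can be illustrated with a toy sketch: the server holds the primal iterate, and each round a sampled client takes exact coordinate steps on its own block of dual variables and returns the induced change to the primal variable. The sketch below uses SDCA-style updates for federated ridge regression purely as an illustration; the problem setup, function names, and update rule are assumptions for exposition and are not the FedDCD algorithm itself.

import numpy as np

# Illustrative dual coordinate ascent for federated ridge regression (a sketch,
# not the paper's FedDCD method). Primal problem, with n_total samples split
# across clients:
#   min_w  (1/(2 n_total)) * sum_i (x_i^T w - y_i)^2 + (lam/2) * ||w||^2
# The primal iterate is recovered from the dual variables via
#   w = (1/(lam * n_total)) * sum_i alpha_i * x_i.

def local_dual_update(X, y, alpha, w, lam, n_total):
    """One local pass: exact coordinate maximization over this client's dual block.
    Returns the induced change to the primal iterate w."""
    delta_w = np.zeros_like(w)
    for i in range(X.shape[0]):
        x_i = X[i]
        # Closed-form coordinate step for the squared loss.
        residual = y[i] - x_i @ (w + delta_w) - alpha[i]
        step = residual / (1.0 + (x_i @ x_i) / (lam * n_total))
        alpha[i] += step
        delta_w += step * x_i / (lam * n_total)
    return delta_w

def federated_dual_cd(clients, dim, lam, rounds=50, seed=0):
    """Server loop: each round, sample one client, let it update its dual block,
    and apply the returned primal correction."""
    rng = np.random.default_rng(seed)
    n_total = sum(X.shape[0] for X, _ in clients)
    w = np.zeros(dim)
    alphas = [np.zeros(X.shape[0]) for X, _ in clients]
    for _ in range(rounds):
        k = rng.integers(len(clients))   # sample one client per round
        X, y = clients[k]
        w += local_dual_update(X, y, alphas[k], w, lam, n_total)
    return w

With clients = [(X1, y1), (X2, y2)], each Xk an (n_k x dim) array, federated_dual_cd(clients, dim, lam=0.1) returns an approximate primal solution w. Keeping the optimization in the dual means each round only the sampled client's dual block changes, which is the flavor of coordinate-wise federated update the abstract refers to.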
@misc{Fan2022Dual,
  author = {Z. Fan and H. Fang and M. P. Friedlander},
  title  = {A dual approach for federated learning},
  year   = {2022},
  month  = {January},
  doi    = {10.48550/arXiv.2201.11183}
}