Towards Faster Decentralized Stochastic Optimization with Communication Compression, by Rustem Islamov and 2 other authors
Abstract: Communication efficiency has garnered significant attention, as it is considered the main bottleneck for large-scale decentralized Machine Learning applications in distributed and federated settings. In this regime, clients are restricted to transmitting small amounts of quantized information to their neighbors over a communication graph. Numerous endeavors have been made to address this challenging problem by developing algorithms with compressed communication for decentralized non-convex optimization. Despite considerable effort, current results suffer from various issues such as non-scalability with the number of clients, a requirement for large batches, or a bounded-gradient assumption. In this paper, we introduce MoTEF, a novel approach that integrates communication compression with Momentum Tracking and Error Feedback. Our analysis demonstrates that MoTEF achieves most of the desired properties and significantly outperforms existing methods under arbitrary data heterogeneity. We provide numerical experiments to validate our theoretical findings and confirm the practical superiority of MoTEF.
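The Error Feedback component mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' MoTEF algorithm (which additionally uses momentum tracking over a communication graph); it is only a generic single-client error-feedback step with a standard top-k sparsifier, with all names (`top_k`, `error_feedback_step`) chosen here for illustration. The key idea: the residual lost to compression is stored and added back in the next round, rather than discarded.

```python
import numpy as np

def top_k(v, k):
    """Top-k sparsification: keep the k largest-magnitude entries.
    A standard contractive compressor; not specific to this paper."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

def error_feedback_step(grad, e, k, lr=0.1):
    """One generic error-feedback round (illustrative, not MoTEF itself).
    grad: local stochastic gradient; e: residual carried from prior rounds."""
    corrected = lr * grad + e    # add back what compression dropped before
    msg = top_k(corrected, k)    # only this sparse message is transmitted
    e_new = corrected - msg      # store the new compression error
    return msg, e_new
```

By construction `msg + e_new` always equals the uncompressed update, so no information is permanently lost; this accumulation is what lets error-feedback methods tolerate aggressive compression.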
Submission history
From: Rustem Islamov [view email]
[v1] Thu, 30 May 2024 14:51:57 UTC (804 KB)
[v2] Mon, 25 Nov 2024 09:00:40 UTC (5,522 KB)