Efficient Wireless Federated Learning via Low-Rank Gradient Factorization



View a PDF of the paper titled Efficient Wireless Federated Learning via Low-Rank Gradient Factorization, by Mingzhao Guo and 3 other authors


Abstract: This paper presents a novel gradient compression method for federated learning (FL) in wireless systems. The proposed method centers on a low-rank matrix factorization strategy for local gradient compression based on one iteration of a distributed Jacobi successive convex approximation (SCA) at each FL round. The low-rank approximation obtained at one round is used as a "warm start" initialization for Jacobi SCA in the next FL round. A new protocol termed over-the-air low-rank compression (Ota-LC), which combines this gradient compression method with over-the-air computation and error feedback, is shown to have lower computation cost and lower communication overhead than existing benchmarks while matching their inference performance. As an example, when targeting a test accuracy of 70% on the CIFAR-10 dataset, Ota-LC reduces total communication costs by at least 33% compared to benchmark schemes.
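The core idea, factoring a gradient matrix G into low-rank factors U V^T and warm-starting each round's factorization from the previous round's factors, can be illustrated with a minimal sketch. Note this is a simplified alternating least-squares stand-in for the paper's distributed Jacobi SCA update; the function name `als_refine` and all dimensions are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def als_refine(G, U, V):
    """One warm-started alternating least-squares pass fitting G ~ U @ V.T.
    Simplified stand-in for the paper's per-round Jacobi SCA iteration."""
    U = G @ V @ np.linalg.pinv(V.T @ V)    # refresh U with V held fixed
    V = G.T @ U @ np.linalg.pinv(U.T @ U)  # refresh V with the new U held fixed
    return U, V

rng = np.random.default_rng(0)
m, n, r = 64, 32, 4
# A synthetic rank-r "local gradient" matrix for the demo
G = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))

# Round 1 starts from random factors; each later round warm-starts
# from the factors produced in the previous round.
U = rng.standard_normal((m, r))
V = rng.standard_normal((n, r))
for _ in range(3):  # a few FL "rounds" on the same gradient
    U, V = als_refine(G, U, V)

rel_err = np.linalg.norm(G - U @ V.T) / np.linalg.norm(G)
ratio = (m * n) / (r * (m + n))  # entries sent: U and V instead of full G
print(f"relative error: {rel_err:.2e}, compression: {ratio:.1f}x")
```

The communication saving comes from transmitting the r(m+n) entries of U and V instead of the mn entries of G; here that is about a 5.3x reduction, and the warm start means a single cheap refinement per round suffices rather than a full factorization from scratch.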

Submission history

From: Mingzhao Guo [view email]
[v1]
Mon, 15 Jan 2024 06:30:06 UTC (317 KB)
[v2]
Sat, 23 Nov 2024 02:56:34 UTC (242 KB)


