A Unified Analysis of Federated Learning with Arbitrary Client Participation



Authors: Shiqiang Wang and one other author

Abstract: Federated learning (FL) faces challenges of intermittent client availability and computation/communication efficiency. As a result, only a small subset of clients can participate in FL at a given time. It is important to understand how partial client participation affects convergence, but most existing works have either considered idealized participation patterns or obtained results with non-zero optimality error for generic patterns. In this paper, we provide a unified convergence analysis for FL with arbitrary client participation. We first introduce a generalized version of federated averaging (FedAvg) that amplifies parameter updates at an interval of multiple FL rounds. Then, we present a novel analysis that captures the effect of client participation in a single term. By analyzing this term, we obtain convergence upper bounds for a wide range of participation patterns, including both non-stochastic and stochastic cases, which match either the lower bound of stochastic gradient descent (SGD) or the state-of-the-art results in specific settings. We also discuss various insights, recommendations, and experimental results.
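To make the amplification idea described in the abstract concrete, the following is a minimal Python sketch of a FedAvg-style loop in which the accumulated global update over a fixed interval of rounds is rescaled by an extra factor. The names `amplify_interval`, `eta_amp`, the toy quadratic local objective, and the participation predicate are illustrative assumptions, not the paper's notation or exact algorithm.

```python
# Hypothetical sketch: generalized FedAvg where the net global update
# accumulated over the last `amplify_interval` rounds is scaled by `eta_amp`.
# All names and the toy local objective are illustrative assumptions.
import numpy as np

def client_update(x, data, lr=0.1, local_steps=5):
    # Placeholder local SGD on a quadratic loss 0.5 * ||x - data||^2.
    for _ in range(local_steps):
        x = x - lr * (x - data)
    return x

def generalized_fedavg(x0, client_data, participates, rounds=20,
                       amplify_interval=4, eta_amp=2.0):
    x = x0.copy()
    x_anchor = x0.copy()  # model at the start of the current interval
    for t in range(rounds):
        # Only an arbitrary (possibly non-stochastic) subset of clients
        # participates in round t.
        active = [i for i in range(len(client_data)) if participates(i, t)]
        if active:
            # Standard FedAvg step: average the participating clients' models.
            x = np.mean([client_update(x, client_data[i]) for i in active],
                        axis=0)
        if (t + 1) % amplify_interval == 0:
            # Amplify the net update accumulated over the interval.
            x = x_anchor + eta_amp * (x - x_anchor)
            x_anchor = x.copy()
    return x

# Toy usage: 10 clients, where client i is available only in rounds t
# with t % 10 == i (a simple non-stochastic participation pattern).
data = [np.array([float(i)]) for i in range(10)]
x_final = generalized_fedavg(np.zeros(1), data, lambda i, t: t % 10 == i)
print(x_final)
```

Under these assumptions, the amplification step compensates for the fact that only a few clients contribute to each round, at the cost of applying the (possibly biased) accumulated update more aggressively.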

Submission history

From: Shiqiang Wang
[v1] Thu, 26 May 2022 21:56:31 UTC (1,938 KB)
[v2] Wed, 1 Jun 2022 02:32:40 UTC (1,938 KB)
[v3] Thu, 27 Oct 2022 00:21:25 UTC (2,528 KB)
[v4] Sun, 29 Dec 2024 05:00:26 UTC (2,528 KB)


