A Comprehensive Framework for Analyzing the Convergence of Adam: Bridging the Gap with SGD
Ruinan Jin and 3 other authors
Abstract: Adaptive Moment Estimation (Adam) is a cornerstone optimization algorithm in deep learning, widely recognized for its flexibility with adaptive learning rates and efficiency in handling large-scale data. However, despite its practical success, the theoretical understanding of Adam’s convergence has been constrained by stringent assumptions, such as almost surely bounded stochastic gradients or uniformly bounded gradients, which are more restrictive than those typically required for analyzing stochastic gradient descent (SGD).
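For context, the Adam recursion under analysis is the standard one; the sketch below uses generic notation (step size $\alpha$, momentum parameters $\beta_1, \beta_2$, stabilizer $\epsilon$) that is illustrative rather than taken from the paper.

% Standard Adam update at step t; g_t is a stochastic gradient at the iterate x_t,
% and all operations on vectors are coordinate-wise.
\begin{align*}
m_t &= \beta_1 m_{t-1} + (1-\beta_1)\, g_t &&\text{(first-moment estimate)}\\
v_t &= \beta_2 v_{t-1} + (1-\beta_2)\, g_t^{2} &&\text{(second-moment estimate)}\\
\hat m_t &= \frac{m_t}{1-\beta_1^{\,t}}, \qquad \hat v_t = \frac{v_t}{1-\beta_2^{\,t}} &&\text{(bias correction)}\\
x_{t+1} &= x_t - \alpha\, \frac{\hat m_t}{\sqrt{\hat v_t} + \epsilon} &&\text{(parameter update)}
\end{align*}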
In this paper, we introduce a novel and comprehensive framework for analyzing the convergence properties of Adam. This framework offers a versatile approach to establishing Adam’s convergence. Specifically, we prove that Adam achieves asymptotic (last-iterate) convergence both almost surely and in the $L_1$ sense under the relaxed assumptions typically used for SGD, namely $L$-smoothness and the ABC inequality. Moreover, under the same assumptions, we show that Adam attains non-asymptotic sample complexity bounds similar to those of SGD.
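The two assumptions named above are standard in the SGD literature; one common formulation is sketched below (the constants $A, B, C \ge 0$ and the exact form are illustrative and may differ from the paper's precise statement).

% L-smoothness of the objective f:
\[ \|\nabla f(x) - \nabla f(y)\| \;\le\; L\,\|x - y\| \qquad \forall\, x, y. \]
% ABC inequality on the stochastic gradient g(x;\xi), with f^* = \inf_x f(x):
\[ \mathbb{E}_{\xi}\!\left[\|g(x;\xi)\|^{2}\right] \;\le\; 2A\,\bigl(f(x) - f^{*}\bigr) + B\,\|\nabla f(x)\|^{2} + C. \]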
Submission history
From: Baoxiang Wang
[v1] Sun, 6 Oct 2024 12:15:00 UTC (56 KB)
[v2] Sat, 19 Oct 2024 09:33:12 UTC (62 KB)