When Meta-Learning Meets Online and Continual Learning: A Survey
Jaehyeon Son and 2 other authors
Abstract: Over the past decade, deep neural networks have achieved remarkable success under the standard training scheme of mini-batch stochastic gradient descent on large datasets. Building on this success, a surge of research has explored applying neural networks to other learning scenarios. One framework that has garnered significant attention is meta-learning. Often described as "learning to learn," meta-learning is a data-driven approach to optimizing the learning algorithm itself. Other branches of interest are continual learning and online learning, both of which incrementally update a model with streaming data. While these frameworks were initially developed independently, recent works have begun investigating their combinations, proposing novel problem settings and learning algorithms. However, due to the elevated complexity and lack of unified terminology, discerning the differences between these learning frameworks can be challenging even for experienced researchers. To facilitate a clear understanding, this paper provides a comprehensive survey that organizes the various problem settings using consistent terminology and formal descriptions. By offering an overview of these learning paradigms, our work aims to foster further advancements in this promising area of research.
Submission history
From: Jaehyeon Son
[v1] Thu, 9 Nov 2023 09:49:50 UTC (5,249 KB)
[v2] Fri, 26 Jul 2024 09:39:01 UTC (7,277 KB)
[v3] Fri, 8 Nov 2024 02:36:57 UTC (2,895 KB)