Differential learning kinetics govern the transition from memorization to generalization during in-context learning
by Alex Nguyen and Gautam Reddy
Abstract: Transformers exhibit in-context learning (ICL): the ability to use novel information presented in the context without additional weight updates. Recent work shows that ICL emerges when models are trained on a sufficiently diverse set of tasks and the transition from memorization to generalization is sharp with increasing task diversity. One interpretation is that a network’s limited capacity to memorize favors generalization. Here, we examine the mechanistic underpinnings of this transition using a small transformer applied to a synthetic ICL task. Using theory and experiment, we show that the sub-circuits that memorize and generalize can be viewed as largely independent. The relative rates at which these sub-circuits learn explains the transition from memorization to generalization, rather than capacity constraints. We uncover a memorization scaling law, which determines the task diversity threshold at which the network generalizes. The theory quantitatively explains a variety of other ICL-related phenomena, including the long-tailed distribution of when ICL is acquired, the bimodal behavior of solutions close to the task diversity threshold, the influence of contextual and data distributional statistics on ICL, and the transient nature of ICL.
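To make the memorization-versus-generalization distinction concrete, here is a minimal, hypothetical sketch (not the authors' code, and the abstract does not specify their synthetic task): it assumes a finite-task in-context linear-regression setup, and contrasts a "memorizing" predictor, which matches the context against K stored pretraining tasks, with a "generalizing" predictor that regresses on the context alone. All names and parameters (d, n_ctx, the task family) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_ctx = 8, 16  # input dimension and context length (assumed values)

def make_tasks(K):
    """Sample K linear tasks w_k ~ N(0, I/d) standing in for the pretraining set."""
    return rng.normal(size=(K, d)) / np.sqrt(d)

def sample_sequence(w):
    """Context pairs (X, y) and a held-out query drawn from task w."""
    X = rng.normal(size=(n_ctx, d))
    y = X @ w
    xq = rng.normal(size=d)
    return X, y, xq, xq @ w

def memorize_predict(tasks, X, y, xq):
    """'In-weights' strategy: pick the stored task that best explains the context."""
    errs = ((X @ tasks.T - y[:, None]) ** 2).sum(axis=0)
    return xq @ tasks[np.argmin(errs)]

def generalize_predict(X, y, xq, lam=1e-3):
    """'In-context' strategy: ridge regression on the context alone."""
    w_hat = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)
    return xq @ w_hat

for K in [2, 8, 32, 128]:
    tasks = make_tasks(K)
    mem_err = gen_err = 0.0
    for _ in range(500):
        # Evaluate both strategies on *novel* tasks: memorization can only
        # fall back on the nearest stored task, while in-context regression
        # is task-agnostic and solves novel tasks from the context.
        w_new = rng.normal(size=d) / np.sqrt(d)
        X, y, xq, yq = sample_sequence(w_new)
        mem_err += (memorize_predict(tasks, X, y, xq) - yq) ** 2
        gen_err += (generalize_predict(X, y, xq) - yq) ** 2
    print(f"K={K:4d}  memorize MSE={mem_err/500:.3f}  in-context MSE={gen_err/500:.3f}")
```

Under these assumptions, the in-context predictor's error on novel tasks is small at every K, while the memorizing predictor's error shrinks only as the stored task set grows dense; the two strategies are computed independently, loosely mirroring the paper's picture of independent memorizing and generalizing sub-circuits whose relative learning rates, not capacity, set the transition.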
Submission history
From: Gautam Reddy
[v1] Wed, 27 Nov 2024 22:12:29 UTC (9,016 KB)
[v2] Thu, 12 Dec 2024 16:10:51 UTC (9,024 KB)