On Provable Length and Compositional Generalization

By Kartik Ahuja and one other author

Abstract: Out-of-distribution generalization capabilities of sequence-to-sequence models can be studied through the lens of two crucial forms of generalization: length generalization, the ability to generalize to sequences longer than those seen during training, and compositional generalization, the ability to generalize to token combinations not seen during training. In this work, we provide the first provable guarantees on length and compositional generalization for common sequence-to-sequence models (deep sets, transformers, state space models, and recurrent neural networks) trained to minimize the prediction error. Taking a first-principles perspective, we study the realizable case, i.e., the case where the labeling function is realizable by the architecture. We show that simple, limited-capacity versions of these different architectures achieve both length and compositional generalization. Across all our results and architectures, we find that the learned representations are linearly related to the representations generated by the true labeling function.
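To make the setting concrete, here is a minimal sketch of length generalization in the realizable case, not the paper's experimental setup: the toy task, dimensions, and training details below are illustrative assumptions. The ground-truth labeling function is itself a small deep-sets model, the learner shares that architecture, training uses only short sequences, and evaluation uses much longer ones.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Illustrative sizes (hypothetical, not taken from the paper).
VOCAB, DIM = 16, 8

class DeepSet(nn.Module):
    """Sum token embeddings, then apply a linear readout (permutation-invariant)."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, DIM)
        self.readout = nn.Linear(DIM, 1)

    def forward(self, tokens):                    # tokens: (batch, length)
        pooled = self.embed(tokens).sum(dim=1)    # sum-pool over positions
        return self.readout(pooled).squeeze(-1)

# Fixed "true" labeler: the task is realizable by the learner's architecture.
true_fn = DeepSet()
for p in true_fn.parameters():
    p.requires_grad_(False)

model = DeepSet()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

def batch(length, n=256):
    x = torch.randint(0, VOCAB, (n, length))
    return x, true_fn(x)

# Train only on short sequences (lengths 2-5), minimizing prediction error.
for step in range(2000):
    x, y = batch(length=int(torch.randint(2, 6, (1,))))
    loss = ((model(x) - y) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# Evaluate on sequences ten times longer than any seen during training.
with torch.no_grad():
    x, y = batch(length=50)
    print("length-50 MSE:", ((model(x) - y) ** 2).mean().item())
```

Because both the labeler and the learner are linear in the sum of token embeddings, a learner that fits the short-sequence data well tends to keep low error at longer lengths, which matches the flavor of the realizable-case guarantees described above; the sketch is only meant to illustrate the train/test split, not to reproduce the paper's analysis.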

Submission history

From: Kartik Ahuja
[v1] Wed, 7 Feb 2024 14:16:28 UTC (310 KB)
[v2] Sat, 24 Feb 2024 15:28:51 UTC (313 KB)
[v3] Fri, 7 Jun 2024 20:25:05 UTC (2,272 KB)
[v4] Mon, 11 Nov 2024 09:22:02 UTC (4,488 KB)


