Generation through the lens of learning theory



By Jiaxun Li and 2 other authors

Abstract: We study generation through the lens of statistical learning theory. First, we abstract and formalize the results of Gold [1967], Angluin [1979], Angluin [1980], and Kleinberg and Mullainathan [2024] in terms of a binary hypothesis class defined over an abstract example space. Then, we extend the notion of “generation” from Kleinberg and Mullainathan [2024] to two new settings, which we call “uniform” and “non-uniform” generation, and provide a characterization of which hypothesis classes are uniformly and non-uniformly generatable. As is standard in learning theory, our characterizations are in terms of the finiteness of a new combinatorial dimension, termed the Closure dimension. This allows us to compare generatability with predictability (captured via PAC and online learnability) and show that these two properties of hypothesis classes are incompatible: there are classes that are generatable but not predictable, and vice versa. Finally, we extend our results to capture prompted generation and give a complete characterization of which classes are prompt generatable, generalizing some of the work by Kleinberg and Mullainathan [2024].
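To make the setting concrete, here is a minimal toy sketch of "generation in the limit" in the spirit of Kleinberg and Mullainathan [2024]. It is an illustration only, not the paper's construction: the hypothesis class (multiples of a fixed integer), the stream of positive examples, and the intersection-based generator are all assumptions chosen for simplicity. The generator sees positive examples from an unknown target language and must emit unseen elements that belong to the target; outputting from the intersection of all still-consistent languages guarantees validity.

```python
# Toy class of infinite "languages" over the naturals: multiples of k.
# (Illustrative choice, not from the paper.)
LANGS = {k: (lambda n, k=k: n % k == 0) for k in (2, 3, 5)}

def generate(seen):
    """Return an unseen natural number lying in every language still
    consistent with the observed examples. Because that intersection
    is a subset of the target, the output is always a valid new string."""
    consistent = [m for m in LANGS.values() if all(m(x) for x in seen)]
    n = 0
    while True:
        if n not in seen and all(m(n) for m in consistent):
            return n
        n += 1

# Simulate: the hidden target is "multiples of 2"; stream in examples.
seen = set()
outputs = []
for x in (2, 4, 6, 8):
    seen.add(x)
    g = generate(seen)
    outputs.append(g)
    seen.add(g)  # never repeat an emitted element
```

Every emitted element is an even number not previously seen, so the generator succeeds on this class; the paper's Closure dimension characterizes exactly when such a strategy can be made to work uniformly or non-uniformly.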

Submission history

From: Vinod Raman [view email]
[v1] Thu, 17 Oct 2024 16:14:49 UTC (40 KB)
[v2] Mon, 21 Oct 2024 17:21:16 UTC (39 KB)
[v3] Thu, 24 Oct 2024 14:46:54 UTC (38 KB)
[v4] Thu, 21 Nov 2024 04:22:12 UTC (276 KB)


