Quest: Query-centric Data Synthesis Approach for Long-context Scaling of Large Language Model
by Chaochen Gao and 3 other authors
Abstract: Recent advancements in large language models (LLMs) have highlighted the importance of extending context lengths to handle complex tasks. Traditional approaches to long-context training often rely on filtered long documents, which leads to domain imbalance and limits model performance. To address this, techniques such as random document concatenation (Standard) and similarity-based methods (KNN, ICLM) have been developed, but they sacrifice either semantic coherence or diversity. To balance both, we introduce Quest, a query-centric data synthesis method that aggregates semantically relevant yet diverse documents. Quest uses a generative model to predict potential queries for each document and groups documents with similar queries and keywords. Extensive experiments demonstrate Quest's superior performance on long-context tasks, achieving remarkable results with context lengths of up to 1M tokens and confirming its scalability across model sizes.
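As a rough illustration of the query-centric grouping described in the abstract, the Python sketch below groups documents by the keywords of their predicted queries and concatenates each group into a long training sample. The helper functions and the keyword-overlap heuristic are illustrative assumptions, not the authors' implementation; in Quest the queries come from a generative model, which is stubbed out here.

from collections import defaultdict

def predict_queries(doc):
    # Stand-in for the generative query-prediction model described in the
    # abstract: here we simply treat the document's first sentence as its query.
    return [doc.split(".")[0]]

def extract_keywords(query):
    # Naive keyword extraction: lowercase tokens longer than three characters.
    return frozenset(w.lower().strip(",;:") for w in query.split() if len(w) > 3)

def group_documents(docs):
    # Bucket documents whose predicted queries share the same keyword set;
    # each bucket can then be concatenated into one long-context training sample.
    groups = defaultdict(list)
    for doc in docs:
        for query in predict_queries(doc):
            groups[extract_keywords(query)].append(doc)
    return {kw: " ".join(bucket) for kw, bucket in groups.items()}

if __name__ == "__main__":
    docs = [
        "Transformers scale attention quadratically. More text follows.",
        "Transformers scale attention quadratically. A second related document.",
        "Gardening tips for spring. Unrelated content.",
    ]
    for keywords, sample in group_documents(docs).items():
        print(sorted(keywords), "->", len(sample), "chars")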
Submission history
From: Chaochen Gao
[v1] Thu, 30 May 2024 08:50:55 UTC (4,291 KB)
[v2] Thu, 20 Jun 2024 02:26:48 UTC (4,596 KB)
[v3] Sat, 14 Sep 2024 11:57:54 UTC (4,726 KB)
[v4] Tue, 24 Sep 2024 09:06:21 UTC (4,736 KB)
[v5] Wed, 9 Oct 2024 12:14:22 UTC (6,225 KB)