Fundamental Limits of Prompt Compression: A Rate-Distortion Framework for Black-Box Language Models



Authors: Alliot Nagle and 5 other authors

Abstract: We formalize the problem of prompt compression for large language models (LLMs) and present a framework that unifies token-level prompt compression methods which create hard prompts for black-box models. We derive the distortion-rate function for this setup as a linear program and provide an efficient algorithm to compute this fundamental limit via the dual of the linear program. Using the distortion-rate function as the baseline, we study the performance of existing compression schemes on a synthetic dataset of prompts generated from a Markov chain, together with natural-language queries and their answers. Our empirical analysis demonstrates that query-aware prompt compression, in which the compressor knows the downstream task or query for the black-box LLM, is critical. We show that there is a large gap between the performance of current prompt compression methods and the optimal strategy, and we propose Adaptive QuerySelect, a query-aware, variable-rate adaptation of a prior method, to close the gap. We extend our experiments to a small natural-language dataset to confirm that our findings on the synthetic data carry over to natural language.
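
The abstract does not spell out the linear program, but the key observation it suggests is easy to illustrate: if the rate is measured as the expected token length of the compressed hard prompt, then both the expected distortion and the rate constraint are linear in the conditional distribution q(m | x) of compressed prompt m given source prompt x, so the distortion-rate limit is the value of an ordinary LP. Below is a minimal sketch under that assumption; the function name and the toy distortion table are hypothetical, not from the paper, and scipy's linprog solves the small primal directly rather than via the dual as the paper's algorithm does.

```python
# A minimal sketch of a distortion-rate limit as a linear program.
# Assumptions (not from the paper): rate = expected token length of the
# compressed hard prompt; optimization is over the conditional distribution
# q(m | x), which makes objective and rate constraint linear in q.
import numpy as np
from scipy.optimize import linprog

def distortion_rate_lp(p_x, lengths, dist, rate_budget):
    """Minimize E[d(x, m)] over q(m | x) subject to E[len(m)] <= rate_budget.

    p_x         (n,)   source distribution over prompts x
    lengths     (k,)   token length of each candidate compressed prompt m
    dist        (n, k) distortion d(x, m) on the downstream black-box task
    rate_budget float  cap on the expected compressed-prompt length
    """
    n, k = dist.shape
    # Decision variables: q[i, j] = q(m_j | x_i), flattened row-major.
    c = (p_x[:, None] * dist).ravel()                           # E[d(x, m)]
    A_ub = (p_x[:, None] * lengths[None, :]).ravel()[None, :]   # E[len(m)]
    b_ub = np.array([rate_budget])
    # Each conditional q(. | x_i) must sum to one.
    A_eq = np.zeros((n, n * k))
    for i in range(n):
        A_eq[i, i * k:(i + 1) * k] = 1.0
    b_eq = np.ones(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=(0.0, 1.0), method="highs")
    return res.fun  # minimal expected distortion at this rate budget

# Toy usage: sweep the rate budget to trace out the distortion-rate curve.
rng = np.random.default_rng(0)
p_x = np.full(4, 0.25)                    # uniform over 4 source prompts
lengths = np.array([1.0, 2.0, 3.0, 4.0])  # candidate compressed-prompt lengths
dist = rng.uniform(size=(4, 4))           # hypothetical distortion table
for R in (1.0, 2.0, 3.0):
    print(f"D({R}) = {distortion_rate_lp(p_x, lengths, dist, R):.3f}")
```

Sweeping the rate budget traces out a distortion-rate curve of the kind the abstract uses as the baseline for benchmarking compression schemes; the paper's own algorithm computes this limit more efficiently through the dual of the LP.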

Submission history

From: Adway Girish
[v1] Mon, 22 Jul 2024 09:40:13 UTC (2,367 KB)
[v2] Wed, 11 Dec 2024 01:59:36 UTC (3,668 KB)


