VibeCheck: Discover and Quantify Qualitative Differences in Large Language Models



Authors: Lisa Dunlap and 4 other authors

Abstract: Large language models (LLMs) often exhibit subtle yet distinctive characteristics in their outputs that users intuitively recognize but struggle to quantify. These “vibes” (such as tone, formatting, or writing style) influence user preferences, yet traditional evaluations focus primarily on the singular axis of correctness. We introduce VibeCheck, a system for automatically comparing a pair of LLMs by discovering identifying traits of a model (vibes) that are well-defined, differentiating, and user-aligned. VibeCheck iteratively discovers vibes from model outputs, then utilizes a panel of LLM judges to quantitatively measure the utility of each vibe. We validate that the vibes generated by VibeCheck align with those found by human discovery, and we run VibeCheck on pairwise preference data from real-world user conversations with Llama-3-70b vs. GPT-4. VibeCheck reveals that Llama has a friendly, funny, and somewhat controversial vibe. These vibes predict model identity with 80% accuracy and human preference with 61% accuracy. Lastly, we run VibeCheck on a variety of models and tasks, including summarization, math, and captioning, to provide insight into differences in model behavior. VibeCheck discovers vibes such as: Command X prefers to add concrete intros and conclusions when summarizing compared to TNGL; Llama-405b often overexplains its thought process on math problems compared to GPT-4o; and GPT-4 prefers to focus on the mood and emotions of the scene when captioning compared to Gemini-1.5-Flash. Code can be found at this https URL.
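The abstract compresses the method into one sentence: iteratively discover vibes from paired model outputs, then use a panel of LLM judges to quantify each vibe. As a rough, hedged illustration of that loop (not the authors' implementation; see the linked repository for the real system), the Python sketch below assumes a generic `llm(prompt)` chat-completion call, and the helper names `propose_vibes`, `judge_vibe`, and `score_vibes` are hypothetical:

```python
# A minimal sketch of a VibeCheck-style discover-then-judge loop, under
# assumptions: `llm` stands in for any LLM API call; prompts, scoring scale,
# and aggregation are illustrative, not the paper's actual design.

def llm(prompt: str) -> str:
    """Placeholder for a call to an LLM API (hypothetical, not the paper's code)."""
    raise NotImplementedError

def propose_vibes(pairs, num_vibes=10):
    """Ask an LLM to name qualitative axes ('vibes') that differentiate
    model A's outputs from model B's outputs on the same prompts."""
    examples = "\n\n".join(
        f"PROMPT: {p}\nMODEL A: {a}\nMODEL B: {b}" for p, a, b in pairs[:5]
    )
    reply = llm(
        f"Here are responses from two models to the same prompts:\n{examples}\n\n"
        f"List {num_vibes} well-defined stylistic axes (e.g. tone, formatting) "
        "on which the two models consistently differ, one per line."
    )
    return [line.strip("- ").strip() for line in reply.splitlines() if line.strip()]

def judge_vibe(vibe, prompt_text, output):
    """One LLM judge scores a single output on a single vibe (-1, 0, or 1)."""
    verdict = llm(
        f"Vibe: {vibe}\nPrompt: {prompt_text}\nResponse: {output}\n"
        "Does the response exhibit this vibe strongly (1), weakly (0), "
        "or not at all (-1)? Answer with just the number."
    )
    return int(verdict.strip())

def score_vibes(vibes, pairs, panel_size=3):
    """A panel of judges scores each model's output on each vibe; a vibe is
    differentiating if the two models get systematically different scores."""
    separation = {}
    for vibe in vibes:
        diffs = []
        for p, a, b in pairs:
            score_a = sum(judge_vibe(vibe, p, a) for _ in range(panel_size)) / panel_size
            score_b = sum(judge_vibe(vibe, p, b) for _ in range(panel_size)) / panel_size
            diffs.append(score_a - score_b)
        separation[vibe] = sum(diffs) / len(diffs)  # mean A-minus-B gap per vibe
    return separation
```

In the paper's evaluation, per-vibe scores like these serve as features for predicting which model produced an output and which output a user preferred, which is where the reported 80% and 61% accuracy figures come from.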

Submission history

From: Lisa Dunlap
[v1] Thu, 10 Oct 2024 17:59:17 UTC (16,391 KB)
[v2] Thu, 24 Oct 2024 20:01:12 UTC (16,385 KB)
[v3] Mon, 28 Oct 2024 06:11:31 UTC (16,383 KB)
[v4] Mon, 2 Dec 2024 20:27:39 UTC (11,154 KB)
[v5] Fri, 13 Dec 2024 22:09:12 UTC (11,154 KB)


