MetaMetrics: Calibrating Metrics For Generation Tasks Using Human Preferences



By Genta Indra Winata and 4 other authors

Abstract: Understanding the quality of a performance evaluation metric is crucial for ensuring that model outputs align with human preferences. However, it remains unclear how well each metric captures the diverse aspects of these preferences, as metrics often excel in one particular area but not across all dimensions. To address this, it is essential to systematically calibrate metrics to specific aspects of human preference, catering to the unique characteristics of each aspect. We introduce MetaMetrics, a calibrated meta-metric designed to evaluate generation tasks across different modalities in a supervised manner. MetaMetrics optimizes the combination of existing metrics to enhance their alignment with human preferences. Our metric demonstrates flexibility and effectiveness in both language and vision downstream tasks, showing significant benefits across various multilingual and multi-domain scenarios. MetaMetrics aligns closely with human preferences and is highly extendable and easily integrable into any application. This makes MetaMetrics a powerful tool for improving the evaluation of generation tasks, ensuring that metrics are more representative of human judgment across diverse contexts.
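The core idea of the abstract, combining existing metrics so that the combination correlates better with human judgments than any single metric, can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: the function name `calibrate_weights`, the use of a grid search over a two-metric convex combination, and Pearson correlation as the alignment objective are all assumptions made for the example.

```python
import numpy as np

def calibrate_weights(metric_scores, human_scores, steps=101):
    """Grid-search a convex combination w*m1 + (1-w)*m2 of two metric
    score vectors that maximizes Pearson correlation with human scores.
    (Illustrative sketch only; MetaMetrics itself is more general.)"""
    m1, m2 = metric_scores
    best_w, best_r = 0.0, -np.inf
    for w in np.linspace(0.0, 1.0, steps):
        combined = w * m1 + (1.0 - w) * m2
        r = np.corrcoef(combined, human_scores)[0, 1]
        if r > best_r:
            best_w, best_r = w, r
    return best_w, best_r

# Toy data: metric A tracks human scores closely, metric B is pure noise.
rng = np.random.default_rng(0)
human = rng.uniform(0.0, 1.0, 50)
metric_a = human + rng.normal(0.0, 0.05, 50)
metric_b = rng.uniform(0.0, 1.0, 50)

w, r = calibrate_weights((metric_a, metric_b), human)
# The calibration should put most of the weight on the informative metric.
```

In this toy setup the learned weight `w` lands near 1.0, i.e. the calibration discovers that metric A is the one aligned with human preference; a real system would extend this to many metrics and richer objectives (e.g. rank correlations) learned in a supervised manner, as the abstract describes.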

Submission history

From: Genta Indra Winata [view email]
[v1] Thu, 3 Oct 2024 11:01:25 UTC (5,980 KB)
[v2] Mon, 7 Oct 2024 16:39:24 UTC (6,485 KB)
[v3] Thu, 28 Nov 2024 23:46:52 UTC (7,953 KB)


