Conformal Prediction for Class-wise Coverage via Augmented Label Rank Calibration
by Yuanjie Shi and 3 other authors
Abstract: Conformal prediction (CP) is an emerging uncertainty quantification framework that constructs a prediction set guaranteed to cover the true label with a pre-specified marginal or conditional probability. Although valid coverage guarantees have been extensively studied for classification problems, CP often produces large prediction sets that may not be practically useful. This issue is exacerbated in the setting of class-conditional coverage on classification tasks with many and/or imbalanced classes. This paper proposes the Rank Calibrated Class-conditional CP (RC3P) algorithm, which reduces prediction set sizes while achieving class-conditional coverage, i.e., valid coverage for each class. In contrast to the standard class-conditional CP (CCP) method, which uniformly thresholds the class-wise conformity score for every class, the augmented label rank calibration step allows RC3P to selectively apply this class-wise thresholding subroutine only to the subset of classes whose class-wise top-k error is small. We prove that, regardless of the classifier and the data distribution, RC3P achieves class-wise coverage. We also show that RC3P produces smaller prediction sets than the CCP method. Comprehensive experiments on multiple real-world datasets demonstrate that RC3P achieves class-wise coverage with a 26.25% reduction in prediction set sizes on average.
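To make the contrast between CCP and the rank-calibrated variant concrete, here is a minimal sketch of class-conditional conformal prediction with an optional top-k rank filter. It is an illustration under simplifying assumptions, not the authors' implementation: the nonconformity score `1 - p(y|x)`, the function names, and the single shared cutoff `k` are all hypothetical choices (RC3P calibrates the rank cutoff per class using class-wise top-k errors).

```python
import numpy as np

def classwise_thresholds(cal_scores, cal_labels, alpha, n_classes):
    """Per-class conformal threshold: for each class y, the
    ceil((n_y + 1) * (1 - alpha))-th smallest nonconformity score
    among calibration points with true label y."""
    thresholds = np.empty(n_classes)
    for y in range(n_classes):
        s = np.sort(cal_scores[cal_labels == y])
        n_y = len(s)
        idx = min(int(np.ceil((n_y + 1) * (1 - alpha))), n_y) - 1
        thresholds[y] = s[idx]
    return thresholds

def prediction_set(probs, thresholds, k=None):
    """CCP set: {y : score(x, y) <= t_y}. With the (hypothetical)
    rank filter, a label must additionally rank within the model's
    top-k predictions, mimicking RC3P's rank calibration idea."""
    scores = 1.0 - probs                 # simple nonconformity score (assumed)
    in_set = scores <= thresholds
    if k is not None:
        ranks = np.argsort(np.argsort(-probs)) + 1  # 1 = most likely label
        in_set &= ranks <= k
    return np.flatnonzero(in_set)
```

For example, with softmax output `[0.6, 0.3, 0.1]` and per-class thresholds `[0.5, 0.8, 0.95]`, plain CCP includes all three labels, while adding the rank filter with `k=2` drops the lowest-ranked label, shrinking the set without touching the class-wise thresholds.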
Submission history
From: Yuanjie Shi
[v1] Mon, 10 Jun 2024 22:01:34 UTC (917 KB)
[v2] Thu, 31 Oct 2024 02:32:07 UTC (1,013 KB)
[v3] Sat, 16 Nov 2024 00:04:07 UTC (1,027 KB)