Generalization Error of the Tilted Empirical Risk
by Gholamali Aminian and 5 other authors
Abstract: The generalization error (risk) of a supervised statistical learning algorithm quantifies its prediction ability on previously unseen data. Inspired by exponential tilting, Li et al. (2021) proposed the tilted empirical risk as a non-linear risk metric for machine learning applications such as classification and regression problems. In this work, we examine the generalization error of the tilted empirical risk. In particular, we provide uniform and information-theoretic bounds on the tilted generalization error, defined as the difference between the population risk and the tilted empirical risk, with a convergence rate of $O(1/\sqrt{n})$, where $n$ is the number of training samples. Furthermore, we study the solution to the KL-regularized expected tilted empirical risk minimization problem and derive an upper bound on the expected tilted generalization error with a convergence rate of $O(1/n)$.
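For concreteness, here is a minimal sketch of the tilted empirical risk studied in the paper, assuming the standard definition from Li et al. (2021), $\hat{R}_t = \frac{1}{t}\log\big(\frac{1}{n}\sum_{i=1}^n e^{t\,\ell_i}\big)$; the function name and the sample loss values are illustrative, not from the paper:

```python
import numpy as np
from scipy.special import logsumexp

def tilted_empirical_risk(losses, t):
    """Tilted empirical risk: (1/t) * log((1/n) * sum_i exp(t * l_i)).

    t > 0 up-weights high-loss samples (t -> +inf approaches the max loss);
    t < 0 down-weights them; t -> 0 recovers the ordinary empirical mean.
    """
    losses = np.asarray(losses, dtype=float)
    n = losses.size
    # logsumexp avoids overflow in exp(t * l_i) for large |t| * l_i.
    return float(logsumexp(t * losses) - np.log(n)) / t

losses = np.array([0.1, 0.2, 0.3, 5.0])          # one outlier loss
print(tilted_empirical_risk(losses, t=1e-6))      # ~1.40, close to the mean
print(tilted_empirical_risk(losses, t=2.0))       # ~4.31, pulled toward the outlier
print(tilted_empirical_risk(losses, t=-2.0))      # pulled toward the small losses
```

The tilted generalization error analyzed in the paper is then the gap between the population risk and this quantity, which the bounds above control at rate $O(1/\sqrt{n})$.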
Submission history
From: Gholamali Aminian
[v1] Sat, 28 Sep 2024 18:31:51 UTC (79 KB)
[v2] Thu, 17 Oct 2024 12:23:07 UTC (88 KB)