PACE: Marrying generalization in PArameter-efficient fine-tuning with Consistency rEgularization



By Yao Ni and 2 other authors

Abstract: Parameter-Efficient Fine-Tuning (PEFT) effectively adapts pre-trained transformers to downstream tasks. However, optimizing task performance often comes at the cost of generalizability in fine-tuned models. To address this issue, we theoretically connect smaller weight gradient norms during training and larger datasets to improvements in model generalization. Motivated by this connection, we propose reducing gradient norms to enhance generalization and aligning the fine-tuned model with its pre-trained counterpart to retain knowledge from large-scale pre-training data. Yet naive alignment does not guarantee gradient reduction and can potentially cause gradient explosion, complicating efforts to manage gradients. To address this, we propose PACE, marrying generalization of PArameter-efficient fine-tuning with Consistency rEgularization. We perturb the features learned by the adapter with multiplicative noise and require the fine-tuned model to remain consistent for the same sample under different perturbations. Theoretical analysis shows that PACE not only implicitly regularizes gradients for enhanced generalization, but also implicitly aligns the fine-tuned and pre-trained models to retain knowledge. Experimental evidence supports our theory: PACE surpasses existing PEFT methods on visual adaptation tasks (VTAB-1k, FGVC, few-shot learning, domain adaptation), showcasing its potential for resource-efficient fine-tuning. It also improves LoRA on text classification (GLUE) and mathematical reasoning (GSM-8K). The code is available at this https URL
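The mechanism described in the abstract, an adapter whose features are perturbed with multiplicative noise, plus a consistency penalty between two stochastic forward passes of the same sample, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the `NoisyAdapter` class, the rank and noise-scale values, and the `consistency_loss` helper are all hypothetical names chosen here for clarity.

```python
import torch
import torch.nn as nn

class NoisyAdapter(nn.Module):
    """Low-rank (LoRA-style) adapter whose output features are perturbed
    with multiplicative Gaussian noise during training."""
    def __init__(self, dim: int, rank: int = 4, noise_std: float = 0.1):
        super().__init__()
        self.down = nn.Linear(dim, rank, bias=False)  # project down to low rank
        self.up = nn.Linear(rank, dim, bias=False)    # project back up
        self.noise_std = noise_std

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.up(self.down(x))
        if self.training:
            # multiplicative noise: each adapter feature is scaled by (1 + eps)
            h = h * (1.0 + torch.randn_like(h) * self.noise_std)
        return x + h  # residual connection, as in standard adapters

def consistency_loss(model: nn.Module, x: torch.Tensor) -> torch.Tensor:
    """Run the same batch through the model twice (different noise draws)
    and penalize the squared disagreement between the two outputs."""
    out1 = model(x)
    out2 = model(x)
    return ((out1 - out2) ** 2).mean()
```

In training, this consistency term would be added to the ordinary task loss with some weight; at evaluation time the noise is disabled (`model.eval()`), so the adapter becomes deterministic.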

Submission history

From: Yao Ni
[v1]
Wed, 25 Sep 2024 17:56:00 UTC (416 KB)
[v2]
Mon, 7 Oct 2024 01:00:46 UTC (418 KB)
[v3]
Sat, 2 Nov 2024 03:27:12 UTC (482 KB)
[v4]
Wed, 15 Jan 2025 16:56:26 UTC (484 KB)


