KIF: Knowledge Identification and Fusion for Language Model Continual Learning


By Yujie Feng and 6 other authors

Abstract: Language model continual learning (CL) has recently attracted significant interest for its ability to adapt large language models (LLMs) to dynamic real-world scenarios without retraining. A major challenge in this domain is catastrophic forgetting, where models lose previously acquired knowledge upon learning new tasks. Existing approaches commonly utilize multiple parameter-efficient fine-tuning (PEFT) blocks to acquire task-specific knowledge, yet these methods are inefficient and fail to leverage potential knowledge transfer across tasks. In this paper, we introduce a novel CL framework for language models, named Knowledge Identification and Fusion (KIF), which boosts knowledge transfer without depending on memory replay. KIF initially segregates the model into ‘skill units’ based on parameter dependencies, allowing for more precise control. Subsequently, it employs a novel group-wise knowledge identification technique to ascertain the importance distribution of skill units for a new task. By comparing this importance distribution with those from previous tasks, we implement a fine-grained knowledge fusion strategy that retains task-specific knowledge, thereby preventing forgetting, and updates task-shared knowledge, which facilitates bi-directional knowledge transfer. As a result, KIF achieves an optimal balance between retaining prior knowledge and excelling in new tasks. KIF also demonstrates strong generalizability, making it suitable for various base models and adaptable to PEFT methods like LoRA. Furthermore, it offers notable extensibility, supporting enhancements through integration with memory replay techniques. Comprehensive experiments conducted on two CL benchmarks, involving models ranging from 220M to 7B parameters, affirm the effectiveness of KIF and its variants across different settings.
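The abstract describes the mechanism only at a high level. As a rough sketch of the general idea rather than the paper's actual algorithm, the Python snippet below assumes gradient-magnitude importance scores, a dictionary of named ‘skill units’, and a simple importance threshold tau; all of these names and choices are illustrative assumptions.

```python
# Illustrative sketch of importance-guided knowledge fusion across tasks.
# The skill-unit grouping, the gradient-magnitude importance proxy, and the
# threshold tau are assumptions for illustration only.

import torch


def unit_importance(model, loss_fn, batch, skill_units):
    """Estimate a normalized importance distribution over skill units for the
    current task, using accumulated gradient magnitude as an assumed proxy."""
    model.zero_grad()
    loss = loss_fn(model, batch)
    loss.backward()
    scores = {}
    for name, params in skill_units.items():
        scores[name] = sum(
            p.grad.abs().sum().item() for p in params if p.grad is not None
        )
    total = sum(scores.values()) or 1.0
    return {name: score / total for name, score in scores.items()}


def fuse(old_params, new_params, old_imp, new_imp, tau=0.1):
    """Fine-grained fusion: units important to previous tasks but not to the
    new one keep their old weights (preventing forgetting); the remaining
    units take the newly learned weights (enabling knowledge transfer)."""
    fused = {}
    for name in old_params:
        if old_imp[name] > tau and new_imp[name] <= tau:
            fused[name] = old_params[name]   # task-specific: retain
        else:
            fused[name] = new_params[name]   # task-shared or new: update
    return fused
```

In this sketch, the importance distributions of the previous and current tasks are compared unit by unit, so retention and updating happen at the granularity of skill units rather than whole PEFT blocks.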

Submission history

From: Yujie Feng
[v1] Fri, 9 Aug 2024 17:44:45 UTC (9,325 KB)
[v2] Fri, 30 Aug 2024 11:14:17 UTC (11,178 KB)
[v3] Wed, 18 Dec 2024 12:07:27 UTC (11,275 KB)
[v4] Thu, 23 Jan 2025 12:06:37 UTC (11,248 KB)


