Neuron Patching: Semantic-based Neuron-level Language Model Repair for Code Generation
Jian Gu and 3 other authors
Abstract: Language Models (LMs) have become widely used in software engineering, especially for tasks such as code generation, where they are referred to as code LMs. These models are effective at generating code, making it easier for developers to automate coding activities. However, research has highlighted a significant limitation: LMs often produce code that is incorrect, buggy, or not fully functional. Fixing such failures typically calls for hot-fix techniques, i.e., updating the model with only limited data, which is challenging yet essential for maximizing the model's utility. In this paper, we propose Model Improvement via Neuron Targeting (MINT), a novel approach for repairing code LMs. MINT leverages the semantic property of language models to perform neuron-level repairs. By analyzing the relationships between the model's latent representations, the incorrect outputs, and the desired outputs, MINT determines which neurons are worth updating. This ensures that only the neurons crucial to the model's failure are targeted, avoiding unnecessary changes and allowing for a more efficient and precise repair process. MINT is effective, efficient, and reliable, capable of correcting a neural model by patching a minimum number of neurons (usually one or two). We evaluate the approach on three coding tasks: line-level code generation, shellcode generation, and intent-to-bash translation. The experimental results demonstrate that MINT significantly outperforms the state-of-the-art in both effectiveness and efficiency. In addition, we analyze and discuss the side effects of model repair techniques, including the balance between generalization and specificity and the performance after multiple repairs in succession.
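To make the idea of neuron-level repair concrete, below is a minimal, hypothetical sketch in that spirit; it is not the authors' MINT implementation. It loads GPT-2 through Hugging Face transformers, scores each FFN neuron in one layer by its contribution to the logit gap between an incorrect token and a desired token, and nudges only the single most-blamed neuron's output weights. The choice of model, layer, scoring rule, and step size are all illustrative assumptions.

```python
# Hypothetical neuron-level patching sketch, in the spirit of the abstract.
# NOT the authors' MINT algorithm: model, layer, scoring, and step size
# are illustrative assumptions.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

model = GPT2LMHeadModel.from_pretrained("gpt2").eval()
tok = GPT2Tokenizer.from_pretrained("gpt2")

prompt = "def add(a, b):\n    return a"
wrong_id = tok.encode(" -")[0]   # token the model wrongly prefers (assumed)
right_id = tok.encode(" +")[0]   # token we want instead (assumed)
ids = tok(prompt, return_tensors="pt").input_ids

# 1. Record the FFN hidden activations (post-GELU) at one layer.
layer = model.transformer.h[6].mlp          # layer choice is arbitrary here
acts = {}
hook = layer.act.register_forward_hook(
    lambda _m, _i, out: acts.update(h=out.detach()))
with torch.no_grad():
    model(ids)
hook.remove()

# 2. Score neurons: how much does each push the wrong token's logit above
#    the desired one, through the down-projection and unembedding?
W_out = layer.c_proj.weight                 # (n_inner, d_model) in GPT-2
W_U = model.lm_head.weight                  # (vocab, d_model)
direction = W_U[wrong_id] - W_U[right_id]   # logit-difference direction
contrib = acts["h"][0, -1] * (W_out @ direction)
neuron = contrib.argmax().item()            # single most-blamed neuron

# 3. Patch: nudge that one neuron's output weights against the wrong
#    direction; the 0.5 step size is an arbitrary illustrative choice.
with torch.no_grad():
    W_out[neuron] -= 0.5 * direction / direction.norm()

# Check: the desired token's logit should now compare more favorably.
with torch.no_grad():
    logits = model(ids).logits[0, -1]
print((logits[right_id] - logits[wrong_id]).item())
```

The sketch illustrates why such repairs can be precise: only one row of one down-projection matrix changes, leaving the rest of the model untouched. A single small step does not guarantee the prediction flips; a real repair method would verify the fix and control for side effects on unrelated inputs.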
Submission history
From: Jian Gu
[v1] Fri, 8 Dec 2023 20:28:08 UTC (1,796 KB)
[v2] Fri, 2 Feb 2024 04:31:00 UTC (1,535 KB)
[v3] Mon, 15 Apr 2024 07:31:00 UTC (1,856 KB)
[v4] Tue, 6 Aug 2024 03:57:33 UTC (1,955 KB)
[v5] Wed, 20 Nov 2024 14:22:06 UTC (2,064 KB)