GS-KGC: A Generative Subgraph-based Framework for Knowledge Graph Completion with Large Language Models
Rui Yang, Jiahao Zhu, Jianping Man, Hongze Liu, Li Fang, Yi Zhou
Abstract: Knowledge graph completion (KGC) focuses on identifying missing triples in a knowledge graph (KG), which is crucial for many downstream applications. Given the rapid development of large language models (LLMs), several LLM-based methods have been proposed for the KGC task. However, most of them focus on prompt engineering and overlook the fact that finer-grained subgraph information can help LLMs generate more accurate answers. In this paper, we propose a novel completion framework called Generative Subgraph-based KGC (GS-KGC), which uses subgraph information as context for reasoning and adopts a QA approach to perform the KGC task. The framework centers on a subgraph partitioning algorithm that generates negatives and neighbors: negatives encourage LLMs to generate a broader range of answers, while neighbors provide additional contextual information for LLM reasoning. Furthermore, we find that GS-KGC can discover potential triples within the KG as well as new facts beyond it. Experiments on four common KGC datasets highlight the advantages of the proposed GS-KGC, e.g., it achieves a 5.6% improvement in Hits@3 over the LLM-based model CP-KGC on FB15k-237N and a 9.3% improvement over the LLM-based model TECHS on ICEWS14.
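The abstract gives no pseudocode, so the sketch below is only an illustration of the QA formulation it describes: neighbor triples supply context and known-wrong answers (negatives) constrain the LLM's output for a tail-entity query. The function name, prompt wording, and data layout are assumptions made for this example, not the paper's actual implementation.

```python
from typing import List, Tuple

Triple = Tuple[str, str, str]  # (head, relation, tail)

def build_kgc_prompt(
    query_head: str,
    query_relation: str,
    neighbors: List[Triple],
    negatives: List[str],
) -> str:
    """Assemble a QA-style prompt for predicting the missing tail entity.

    `neighbors` are known triples around the query head (contextual subgraph),
    `negatives` are entities already known to be incorrect answers.
    This is a hypothetical prompt layout, not the one used in GS-KGC.
    """
    context = "\n".join(f"({h}, {r}, {t})" for h, r, t in neighbors)
    excluded = ", ".join(negatives) if negatives else "none"
    return (
        "Known facts about the entity:\n"
        f"{context}\n"
        f"The following answers are known to be incorrect: {excluded}.\n"
        f"Question: ({query_head}, {query_relation}, ?) -- "
        "which entity completes this triple? Answer with one entity name."
    )

# Toy usage: predict a missing tail entity given a small neighborhood.
if __name__ == "__main__":
    neighbors = [
        ("Steven Spielberg", "born_in", "Cincinnati"),
        ("Steven Spielberg", "profession", "Film director"),
    ]
    negatives = ["Jurassic Park", "Schindler's List"]
    print(build_kgc_prompt("Steven Spielberg", "directed", neighbors, negatives))
```

The resulting prompt would then be sent to an LLM, with the negatives steering it away from answers already present in the training graph.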
Submission history
From: Rui Yang
[v1]
Tue, 20 Aug 2024 13:13:41 UTC (523 KB)
[v2]
Fri, 3 Jan 2025 04:12:32 UTC (799 KB)