ReFactor GNNs: Revisiting Factorisation-based Models from a Message-Passing Perspective

By Yihong Chen and 5 other authors

Abstract: Factorisation-based Models (FMs), such as DistMult, have enjoyed enduring success for Knowledge Graph Completion (KGC) tasks, often outperforming Graph Neural Networks (GNNs). However, unlike GNNs, FMs struggle to incorporate node features and to generalise to unseen nodes in inductive settings. Our work bridges the gap between FMs and GNNs by proposing ReFactor GNNs. This new architecture draws upon both modelling paradigms, which previously were largely thought of as disjoint. Concretely, using a message-passing formalism, we show how FMs can be cast as GNNs by reformulating the gradient descent procedure as message-passing operations, which forms the basis of our ReFactor GNNs. Across a multitude of well-established KGC benchmarks, our ReFactor GNNs achieve comparable transductive performance to FMs, and state-of-the-art inductive performance while using an order of magnitude fewer parameters.
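The abstract's central observation, that gradient descent on an FM's node embeddings can be read as message passing, can be sketched for DistMult. The snippet below is a minimal illustration, not the paper's implementation: all names, the toy graph, and the embedding dimension are assumptions. For DistMult, the score of a triple (h, r, t) is the trilinear product <e_h, w_r, e_t>, so the gradient of the score with respect to e_h is w_r * e_t, i.e. one "message" per neighbour, and a gradient step aggregates these messages, exactly as a GNN layer would.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 4  # toy embedding dimension (illustrative)

# Toy entity and relation embeddings.
e = {name: rng.normal(size=dim) for name in ["h", "t1", "t2"]}
w_r = rng.normal(size=dim)

def score(e_h, w, e_t):
    """DistMult trilinear score <e_h, w_r, e_t>."""
    return float(np.sum(e_h * w * e_t))

# Neighbours of node h via relation r in a toy graph.
neighbours = ["t1", "t2"]

# d(score)/d(e_h) = w_r * e_t: one message per neighbour, summed.
messages = [w_r * e[t] for t in neighbours]
aggregated = np.sum(messages, axis=0)

# One gradient-ascent step on the summed scores = one round of
# message passing followed by an additive node update.
lr = 0.1
e_h_new = e["h"] + lr * aggregated

# Sanity check: the aggregated message matches a finite-difference
# estimate of the gradient in the first coordinate.
eps = 1e-6
total = lambda eh: sum(score(eh, w_r, e[t]) for t in neighbours)
pert = e["h"].copy()
pert[0] += eps
numeric = (total(pert) - total(e["h"])) / eps
assert abs(numeric - aggregated[0]) < 1e-4
```

The point of the sketch is the identity, not the optimiser: because each embedding update only consumes terms indexed by a node's incident edges, the whole training procedure factors into local message-passing operations, which is what lets ReFactor GNNs inherit GNN-style inductive behaviour.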

Submission history

From: Yihong Chen [view email]
[v1]
Wed, 20 Jul 2022 15:39:30 UTC (4,368 KB)
[v2]
Thu, 21 Jul 2022 13:33:26 UTC (4,368 KB)
[v3]
Thu, 27 Oct 2022 17:53:29 UTC (5,446 KB)
[v4]
Thu, 16 Jan 2025 15:56:56 UTC (311 KB)


