ReFactorGNNs

Neural Information Processing Systems

In this section, we prove Theorem 1, which we restate here for convenience. Note that the component [...] (highlighted in red) in Equation (18) is a sum, and [...] (highlighted in blue) is a term that contains [...].

State-of-the-art FMs are often trained with training strategies adapted to each model category. In general, we can interpret any auxiliary variable introduced by the optimizer (e.g. the velocity) as [...]. However, the specific equations would depend on the optimizer's dynamics and would be hard to [...].

The two main design choices in Theorem A.1 are 1) the score function and 2) the optimisation algorithm. In the paper, we chose DistMult and GD because of their mathematical simplicity, leading to easier-to-read formulas.

In this paper, we describe the results on FB15K237_v1_ind under a single random seed. One implementation of such an evaluation can be found in GraIL's codebase.
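To make the remark about optimizer auxiliary variables concrete, here is a minimal sketch of momentum gradient descent on a node embedding. The `momentum_step` function, the learning rate, and the momentum coefficient are illustrative assumptions, not the paper's exact formulation; the point is only that the velocity is extra per-node state that the optimizer carries from step to step, which the message-passing reading would have to track alongside the node representation.

```python
import numpy as np

def momentum_step(e, grad, velocity, lr=0.1, beta=0.9):
    """One momentum-GD step on a node embedding `e` (illustrative sketch).

    `velocity` is the auxiliary variable introduced by the optimizer:
    it accumulates gradients across steps and must be kept per node.
    """
    velocity = beta * velocity + grad  # auxiliary state updated first
    return e - lr * velocity, velocity

e = np.zeros(3)
v = np.zeros(3)
e, v = momentum_step(e, np.ones(3), v)  # first step:  v = 1.0,  e = -0.1
e, v = momentum_step(e, np.ones(3), v)  # second step: v = 1.9,  e = -0.29
```

Plain GD is the special case `beta=0`, where no auxiliary state survives between steps; any optimizer with `beta > 0` forces the corresponding GNN view to carry this extra state.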



ReFactor GNNs: Revisiting Factorisation-based Models from a Message-Passing Perspective

Chen, Yihong, Mishra, Pushkar, Franceschi, Luca, Minervini, Pasquale, Stenetorp, Pontus, Riedel, Sebastian

arXiv.org Artificial Intelligence

Factorisation-based Models (FMs), such as DistMult, have enjoyed enduring success for Knowledge Graph Completion (KGC) tasks, often outperforming Graph Neural Networks (GNNs). However, unlike GNNs, FMs struggle to incorporate node features and generalise to unseen nodes in inductive settings. Our work bridges the gap between FMs and GNNs by proposing ReFactor GNNs. This new architecture draws upon both modelling paradigms, which previously were largely thought of as disjoint. Concretely, using a message-passing formalism, we show how FMs can be cast as GNNs by reformulating the gradient descent procedure as message-passing operations, which forms the basis of our ReFactor GNNs. Across a multitude of well-established KGC benchmarks, our ReFactor GNNs achieve comparable transductive performance to FMs, and state-of-the-art inductive performance while using an order of magnitude fewer parameters.
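The abstract's central observation can be illustrated with DistMult, whose score for a triple (h, r, t) is the tri-linear product of the head embedding, relation vector, and tail embedding. The gradient of that score with respect to the head embedding is exactly the tail embedding modulated by the relation vector, so a gradient step on the head node is a "message" arriving from its neighbour. The sketch below is illustrative only (`gd_message_update` and the learning rate are assumptions, not the paper's exact layer definition):

```python
import numpy as np

def distmult_score(e_h, w_r, e_t):
    # DistMult: tri-linear dot product <e_h, w_r, e_t>
    return np.sum(e_h * w_r * e_t)

def gd_message_update(e_h, w_r, e_t, lr=0.1):
    # One gradient-ascent step on the score w.r.t. the head embedding.
    # d(score)/d(e_h) = w_r * e_t: the tail embedding transformed by the
    # relation, i.e. a message from t to h in the message-passing view.
    return e_h + lr * (w_r * e_t)

rng = np.random.default_rng(0)
d = 4
e_h, w_r, e_t = rng.normal(size=d), rng.normal(size=d), rng.normal(size=d)

before = distmult_score(e_h, w_r, e_t)
after = distmult_score(gd_message_update(e_h, w_r, e_t), w_r, e_t)
# The step raises the score by lr * ||w_r * e_t||^2, so `after > before`.
```

Because the update for a node depends only on its neighbours' embeddings and the relation vectors on incident edges, iterating such steps over the graph has the shape of a message-passing layer, which is the correspondence the paper formalises.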