Stop Thinking, Just Do!

Sungsoo Kim's Blog

A General Graph Neural Network Framework for Link Prediction


30 November 2023


Article Source


A General Graph Neural Network Framework for Link Prediction

Abstract

Neural Bellman-Ford Networks (NBFNet) - a path-based representation learning framework for link prediction, built on two ideas (formalized right below):

A. the representation of a pair of nodes is the generalized sum of all path representations between them,

B. each path representation is the generalized product of the edge representations along the path.
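In the paper's notation, as I transcribe it from the preprint, the pair representation for a query relation q reads:

```latex
% Pair representation of (u, v) for query relation q:
% a generalized sum (\oplus) over all paths from u to v of the
% generalized product (\otimes) of the edge representations w_q(e_i) along each path.
h_q(u, v) = \bigoplus_{P \in \mathcal{P}_{uv}} \bigotimes_{i=1}^{|P|} w_q(e_i)
```

The paper notes that particular choices of these two operators recover traditional path-based measures such as the Katz index or graph distance; NBFNet instead learns them.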

Graph Neural Networks (GNNs), Machine Learning, Deep Learning, AI.

Single source document (credits to): Zhaocheng Zhu, Zuobai Zhang, Louis-Pascal Xhonneux, and Jian Tang, “Neural Bellman-Ford Networks: A General Graph Neural Network Framework for Link Prediction,” arXiv:2106.06935v4 (Jan 2022).

Here is my first try at understanding NBFNet for link prediction: the generalized Bellman-Ford iterations, run for t hops, with their summation and multiplication operators swapped out for learned neural operators, which is what gives the method its top-of-the-line performance.

The framework applies to both homogeneous graphs and knowledge graphs.
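To make that operator swapping concrete, here is a minimal sketch of the generalized Bellman-Ford iteration in PyTorch. It is my own toy reconstruction, not the authors' code: the class and argument names are invented, the MESSAGE operator is a simple element-wise product with a relation embedding, and AGGREGATE is a plain sum, whereas NBFNet supports richer choices (e.g., DistMult- or RotatE-style messages, PNA aggregation).

```python
import torch
import torch.nn as nn

class GeneralizedBellmanFord(nn.Module):
    """Toy generalized Bellman-Ford iteration with learned operators (my sketch)."""

    def __init__(self, num_relations, dim, num_hops=6):
        super().__init__()
        self.num_hops = num_hops                          # t: number of hops / iterations
        self.query = nn.Embedding(num_relations, dim)     # query embedding used as boundary condition
        self.relation = nn.Embedding(num_relations, dim)  # w_q(e): edge (relation) representations
        self.update = nn.Linear(2 * dim, dim)             # folds the boundary condition back in

    def forward(self, edge_index, edge_type, num_nodes, source, rel):
        # edge_index: LongTensor (2, E) of (head, tail) pairs; edge_type: LongTensor (E,)
        # source: index of the fixed head node u; rel: index of the query relation q
        q = self.query.weight[rel]                         # (dim,)
        h0 = torch.zeros(num_nodes, q.shape[-1], device=q.device)
        h0[source] = q                                     # indicator boundary condition h^(0)
        h = h0
        src, dst = edge_index
        for _ in range(self.num_hops):
            # MESSAGE = generalized product: element-wise product with the relation embedding.
            msg = h[src] * self.relation(edge_type)
            # AGGREGATE = generalized sum: plain sum over the incoming edges of each node.
            agg = torch.zeros_like(h).index_add_(0, dst, msg)
            # UPDATE: combine the aggregated messages with the boundary condition, as in the recursion.
            h = torch.relu(self.update(torch.cat([agg, h0], dim=-1)))
        return h                                            # h[v] approximates the pair representation of (u, v)
```

Because the source node u is fixed by the boundary condition, one forward pass produces pair representations for every candidate tail v at once; as I understand it, this is where the framework's efficiency over per-pair subgraph methods comes from.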

Any mistakes in my understanding of NBFNet here are mine alone; I had only this arXiv preprint, no other information source. But GNNs are a fascinating research topic, and nobody should feel excluded.

Graph neural networks are representation learning models that encode topological structures of graphs.

Some link prediction methods adopt graph neural networks (GNNs) to automatically extract important features from the local neighborhood of each candidate link.

For link prediction, these frameworks adopt an auto-encoder formulation: a GNN encodes node representations, and edges are decoded as a function over node pairs.
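As a point of reference for that encode/decode split, here is a minimal sketch of the auto-encoder pattern. The layer choices are my own illustrative assumptions (a single mean-aggregation layer as the encoder, an inner-product decoder); real systems use deeper GNNs and richer decoders.

```python
import torch
import torch.nn as nn

class NodeEncoder(nn.Module):
    """One crude mean-aggregation layer standing in for a full GNN encoder (my sketch)."""

    def __init__(self, in_dim, hidden_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, hidden_dim)

    def forward(self, x, edge_index):
        # x: (N, in_dim) node features; edge_index: LongTensor (2, E) of (head, tail) pairs
        src, dst = edge_index
        agg = torch.zeros_like(x).index_add_(0, dst, x[src])       # sum of neighbor features
        deg = torch.zeros(x.shape[0], 1, device=x.device).index_add_(
            0, dst, torch.ones(src.shape[0], 1, device=x.device))
        h = (x + agg) / (deg + 1)                                   # mean over self + neighbors
        return torch.relu(self.linear(h))

def decode(h, pairs):
    # Score a candidate edge (u, v) as the inner product of the two node representations.
    u, v = pairs
    return (h[u] * h[v]).sum(dim=-1)
```

NBFNet departs from this node-level view: instead of decoding a score from two independently encoded nodes, it computes a representation of the node pair directly from the paths between them, which is what the abstract above summarizes.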

