Weisfeiler and Leman Go Walking: Random Walk Kernels Revisited
Technically, various methods of both categories exploit the link between graph data and linear algebra by representing graphs by their (normalized) adjacency matrices. Such methods are often defined, or can be interpreted, in terms of walks. On the other hand, the Weisfeiler-Leman heuristic for graph isomorphism testing has attracted great interest in machine learning [33, 34].
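The link between walks and linear algebra mentioned above comes from a standard fact (an illustration, not code from the paper): the (i, j) entry of the k-th power of the adjacency matrix counts the walks of length k from vertex i to vertex j.

```python
# Illustrative sketch: walk counting via adjacency-matrix powers.
# Matrices are plain nested lists to keep the example dependency-free.

def matmul(X, Y):
    """Multiply two square matrices given as nested lists."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def walk_counts(A, k):
    """Return A**k; entry (i, j) counts length-k walks from i to j."""
    result = A
    for _ in range(k - 1):
        result = matmul(result, A)
    return result

# Triangle graph: every pair of distinct vertices is adjacent.
A = [[0, 1, 1],
     [1, 0, 1],
     [1, 1, 0]]
W2 = walk_counts(A, 2)
# Each vertex has two length-2 walks back to itself (out and back via
# either neighbor) and one length-2 walk to each other vertex, so
# W2 == [[2, 1, 1], [1, 2, 1], [1, 1, 2]].
```

Normalizing A (e.g. dividing each row by the vertex degree) turns these counts into random-walk transition probabilities, which is the form many of the methods above operate on.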
Random Walk Graph Neural Networks
In recent years, graph neural networks (GNNs) have become the de facto tool for performing machine learning tasks on graphs. Most GNNs belong to the family of message passing neural networks (MPNNs). These models employ an iterative neighborhood aggregation scheme to update vertex representations. Then, to compute vector representations of graphs, they aggregate the representations of the vertices using some permutation invariant function. One would expect the hidden layers of a GNN to be composed of parameters that take the form of graphs.
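The MPNN scheme described in the abstract can be sketched in a few lines (a minimal illustration of neighborhood aggregation and sum readout, not the architecture proposed in this paper; the sum aggregator and single round are simplifying assumptions):

```python
# Minimal message-passing sketch: each vertex sums its own and its
# neighbors' feature vectors, then a permutation-invariant sum pools
# vertex representations into one graph representation.

def message_passing_step(adj, features):
    """One aggregation round over an adjacency matrix (nested lists)."""
    n, dim = len(adj), len(features[0])
    updated = []
    for v in range(n):
        agg = list(features[v])  # include the vertex's own features
        for u in range(n):
            if adj[v][u]:
                for d in range(dim):
                    agg[d] += features[u][d]
        updated.append(agg)
    return updated

def readout(features):
    """Permutation-invariant pooling: sum over all vertices."""
    dim = len(features[0])
    return [sum(f[d] for f in features) for d in range(dim)]

# Path graph 0-1-2 with scalar features.
adj = [[0, 1, 0],
       [1, 0, 1],
       [0, 1, 0]]
feats = [[1.0], [0.0], [1.0]]
h1 = message_passing_step(adj, feats)  # [[1.0], [2.0], [1.0]]
g = readout(h1)                        # [4.0]
```

Real MPNNs interleave such aggregation rounds with learned transformations; only the aggregate-then-pool skeleton is shown here.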
MoSE: Unveiling Structural Patterns in Graphs via Mixture of Subgraph Experts
Ye, Junda, Zhang, Zhongbao, Sun, Li, Luo, Siqiang
While graph neural networks (GNNs) have achieved great success in learning from graph-structured data, their reliance on local, pairwise message passing restricts their ability to capture complex, high-order subgraph patterns, leading to insufficient structural expressiveness. Recent efforts have attempted to enhance structural expressiveness by integrating random walk kernels into GNNs. However, these methods are inherently designed for graph-level tasks, which limits their applicability to other downstream tasks such as node classification. Moreover, their fixed kernel configurations hinder the model's flexibility in capturing diverse subgraph structures. To address these limitations, this paper proposes a novel Mixture of Subgraph Experts (MoSE) framework for flexible and expressive subgraph-based representation learning across diverse graph tasks. Specifically, MoSE extracts informative subgraphs via anonymous walks and dynamically routes them to specialized experts based on structural semantics, enabling the model to capture diverse subgraph patterns with improved flexibility and interpretability. We further provide a theoretical analysis of MoSE's expressivity within the Subgraph Weisfeiler-Lehman (SWL) Test, proving that it is more powerful than SWL. Extensive experiments, together with visualizations of learned subgraph experts, demonstrate that MoSE not only outperforms competitive baselines but also provides interpretable insights into structural patterns learned by the model.
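The anonymous walks that MoSE builds on have a simple standard definition (this sketch follows that standard definition, not the paper's implementation): each vertex in a walk is replaced by the index of its first occurrence, so walks with the same revisit pattern share one signature regardless of which vertices they touch.

```python
# Sketch of an anonymous walk: relabel vertices by order of first
# occurrence, capturing the walk's structure but not its identities.

def anonymous_walk(walk):
    """Map a walk (sequence of vertex ids) to its anonymous form."""
    first_seen = {}
    anon = []
    for v in walk:
        if v not in first_seen:
            first_seen[v] = len(first_seen)
        anon.append(first_seen[v])
    return anon

# Two walks over different vertices but with the same revisit pattern
# map to the same signature:
assert anonymous_walk([5, 9, 5, 2]) == [0, 1, 0, 2]
assert anonymous_walk([1, 3, 1, 7]) == [0, 1, 0, 2]
```

Because the signature is identity-free, it can serve as a structural key, e.g. for routing subgraphs to experts as the abstract describes.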
Weisfeiler and Leman Go Walking: Random Walk Kernels Revisited
Random walk kernels have been introduced in seminal work on graph learning and were later largely superseded by kernels based on the Weisfeiler-Leman test for graph isomorphism. We give a unified view on both classes of graph kernels. We study walk-based node refinement methods and formally relate them to several widely-used techniques, including Morgan's algorithm for molecule canonization and the Weisfeiler-Leman test. We define corresponding walk-based kernels on nodes that allow fine-grained parameterized neighborhood comparison, reach Weisfeiler-Leman expressiveness, and are computed using the kernel trick. From this we show that classical random walk kernels with only minor modifications regarding definition and computation are as expressive as the widely-used Weisfeiler-Leman subtree kernel but support non-strict neighborhood comparison. We verify experimentally that walk-based kernels reach or even surpass the accuracy of Weisfeiler-Leman kernels in real-world classification tasks.
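The classical random walk kernel the abstract revisits can be sketched via the direct product graph: common walks of the two input graphs correspond to walks in the product graph, counted through powers of its adjacency matrix and downweighted geometrically. This is a textbook illustration of that construction (truncated to a finite walk length; `lam` and `max_len` are illustrative parameters, not values from the paper):

```python
# Finite-length random walk kernel via the direct product graph.

def product_adjacency(A1, A2):
    """Adjacency matrix of the direct product graph: pair (u, v) is
    adjacent to (u2, v2) iff u~u2 in G1 and v~v2 in G2."""
    n1, n2 = len(A1), len(A2)
    n = n1 * n2
    Ax = [[0] * n for _ in range(n)]
    for u in range(n1):
        for v in range(n2):
            for u2 in range(n1):
                for v2 in range(n2):
                    if A1[u][u2] and A2[v][v2]:
                        Ax[u * n2 + v][u2 * n2 + v2] = 1
    return Ax

def rw_kernel(A1, A2, lam=0.1, max_len=4):
    """Sum over walk lengths k of lam**k times the number of
    length-k walks in the product graph."""
    Ax = product_adjacency(A1, A2)
    n = len(Ax)
    power = [[1 if i == j else 0 for j in range(n)] for i in range(n)]
    total = 0.0
    for k in range(1, max_len + 1):
        power = [[sum(power[i][m] * Ax[m][j] for m in range(n))
                  for j in range(n)] for i in range(n)]
        total += (lam ** k) * sum(sum(row) for row in power)
    return total
```

In the infinite-length version the geometric series must converge, so `lam` has to be chosen small relative to the product graph's largest eigenvalue; the truncated sum above sidesteps that constraint at the cost of ignoring long walks.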