Rethinking Message Passing Neural Networks with Diffusion Distance-guided Stress Majorization
Haoran Zheng, Renchi Yang, Yubo Zhou, Jianliang Xu
arXiv.org Artificial Intelligence
Message passing neural networks (MPNNs) have emerged as go-to models for learning on graph-structured data over the past decade. Despite their effectiveness, most such models still suffer from severe issues such as over-smoothing and over-correlation, owing to their underlying objective of minimizing the Dirichlet energy and the neighborhood aggregation operations derived from it. In this paper, we propose DDSM, a new MPNN model built on an optimization framework that combines stress majorization with orthogonal regularization to overcome these issues. Further, we introduce diffusion distances between nodes into the framework to guide the new message passing operations, and we develop efficient algorithms for approximating these distances, both backed by rigorous theoretical analyses. Our comprehensive experiments show that DDSM consistently and considerably outperforms 15 strong baselines on both homophilic and heterophilic graphs.
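To make the notion of diffusion distance concrete, the following is a minimal sketch (not the paper's approximation algorithm) of the standard random-walk diffusion distance of Coifman and Lafon, which compares the t-step transition distributions of two nodes, weighted by the inverse stationary distribution:

```python
import numpy as np

def diffusion_distances(A, t=3):
    """Pairwise diffusion distances at diffusion time t.

    Standard definition for an undirected graph with adjacency A:
        d_t(i, j)^2 = sum_x (P^t[i, x] - P^t[j, x])^2 / pi(x)
    where P = D^{-1} A is the random-walk transition matrix and
    pi = deg / sum(deg) is its stationary distribution.
    """
    deg = A.sum(axis=1)
    P = A / deg[:, None]                # row-stochastic transition matrix
    Pt = np.linalg.matrix_power(P, t)   # t-step transition probabilities
    pi = deg / deg.sum()                # stationary distribution
    # Weighted squared differences between all pairs of rows of P^t.
    diff = Pt[:, None, :] - Pt[None, :, :]
    d2 = (diff ** 2 / pi[None, None, :]).sum(axis=-1)
    return np.sqrt(d2)

# Example: a 4-node path graph 0-1-2-3; the endpoints 0 and 3
# are farther apart in diffusion distance than neighbors 0 and 1.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
D = diffusion_distances(A, t=3)
```

Distances of this form capture multi-scale connectivity rather than raw hop counts, which is what lets them guide message passing beyond immediate neighborhoods; DDSM's own distance approximations are described in the paper itself.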
Nov-26-2025