AMPNet: Attention as Message Passing for Graph Neural Networks
Syed Asad Rizvi, Nhi Nguyen, Haoran Lyu, Benjamin Christensen, Josue Ortega Caro, Antonio H. O. Fonseca, Emanuele Zappala, Maryam Bagherian, Christopher Averill, Chadi G. Abdallah, Rex Ying, Maria Brbic, Rahul Madhav Dhodapkar, David van Dijk
arXiv.org Artificial Intelligence
Graph Neural Networks (GNNs) have emerged as a powerful representation learning framework for graph-structured data. A key limitation of conventional GNNs is that they represent each node with a single feature vector, potentially overlooking intricate details about individual node features. Here, we propose an Attention-based Message-Passing layer for GNNs (AMPNet) that encodes individual features per node and models feature-level interactions through cross-node attention during message-passing steps. We demonstrate the abilities of AMPNet through extensive benchmarking on real-world biological systems such as fMRI brain activity recordings and spatial genomic data, improving over existing baselines by 20% on fMRI signal reconstruction, with a further 8% improvement when positional embeddings are added. Finally, we validate the ability of AMPNet to uncover meaningful feature-level interactions through case studies on biological systems. We anticipate that our architecture will be highly applicable to graph-structured data where node entities encompass rich feature-level information.
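The core idea described in the abstract — each node carrying a set of per-feature embeddings rather than one vector, with messages computed by cross-node attention between those feature sets — can be sketched as follows. This is a minimal illustrative sketch in PyTorch, not the authors' implementation; the layer name, the mean aggregation, and the use of `nn.MultiheadAttention` are assumptions for illustration only.

```python
import torch
import torch.nn as nn


class CrossNodeAttentionMessage(nn.Module):
    """Illustrative AMPNet-style message-passing step (hypothetical sketch).

    Each node holds F feature embeddings of dimension d. For an edge
    (u -> v), the message is computed by cross-attention in which the
    target node v's feature embeddings act as queries over the source
    node u's feature embeddings (keys and values).
    """

    def __init__(self, d_model: int, n_heads: int = 2):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, F, d) per-node feature embeddings
        # edge_index: (2, E), row 0 = source nodes, row 1 = target nodes
        src, dst = edge_index
        # Cross-node attention: target features attend to source features.
        msgs, _ = self.attn(query=x[dst], key=x[src], value=x[src])  # (E, F, d)
        # Mean-aggregate incoming messages per target node.
        out = torch.zeros_like(x)
        out.index_add_(0, dst, msgs)
        deg = torch.bincount(dst, minlength=x.size(0)).clamp(min=1)
        return out / deg.view(-1, 1, 1).to(x.dtype)


# Usage on a toy 3-node cycle graph with 4 features per node:
torch.manual_seed(0)
layer = CrossNodeAttentionMessage(d_model=8, n_heads=2)
x = torch.randn(3, 4, 8)
edge_index = torch.tensor([[0, 1, 2], [1, 2, 0]])
out = layer(x, edge_index)
```

Note that, unlike a standard GNN layer, the per-feature axis `F` survives aggregation, which is what allows feature-level interactions to be read off the attention weights.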
Oct-6-2023