Mind the Links: Cross-Layer Attention for Link Prediction in Multiplex Networks
Devesh Sharma, Aditya Kishore, Ayush Garg, Debajyoti Mazumder, Debasis Mohapatra, Jasabanta Patro
arXiv.org Artificial Intelligence
Multiplex graphs capture diverse relations among shared nodes. Most predictors either collapse layers or treat them independently, which loses crucial inter-layer dependencies and scales poorly. To overcome this, we frame multiplex link prediction as multi-view edge classification. For each node pair, we construct a sequence of per-layer edge views and apply cross-layer self-attention to fuse evidence for the target layer. We present two models as instances of this framework: Trans-SLE, a lightweight transformer over static embeddings, and Trans-GAT, which combines layer-specific GAT encoders with transformer fusion. To ensure scalability and fairness, we introduce a Union-Set candidate pool and two leakage-free protocols: cross-layer generalization and inductive subgraph generalization. Experiments on six public multiplex datasets show consistent macro-F1 gains over strong baselines (MELL, HOPLP-MUL, RMNE). Our approach is simple, scalable, and compatible with both precomputed embeddings and GNN encoders.
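The core idea of the abstract's framework can be sketched in a few lines: for a node pair, stack one edge view per layer into a sequence and run self-attention across layers, then read out the fused representation at the target layer's position. The sketch below is a hedged, minimal illustration in numpy (single attention head, random projection matrices); the function name `cross_layer_fusion` and the exact readout are assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def cross_layer_fusion(edge_views, target_layer, Wq, Wk, Wv):
    """Single-head self-attention over a sequence of per-layer edge views.

    edge_views: (L, d) array, one row per layer (e.g. a concatenation or
                Hadamard product of the two endpoint embeddings in that layer).
    Returns the fused (d,) representation at the target layer's position,
    which a downstream classifier would score as edge / non-edge.
    """
    Q = edge_views @ Wq                        # (L, d) queries
    K = edge_views @ Wk                        # (L, d) keys
    V = edge_views @ Wv                        # (L, d) values
    scores = Q @ K.T / np.sqrt(Q.shape[-1])    # (L, L) cross-layer affinities
    attn = softmax(scores, axis=-1)            # each layer attends to all layers
    fused = attn @ V                           # (L, d) fused views
    return fused[target_layer]                 # evidence fused for the target layer

# Toy usage: 3 layers, 8-dimensional edge views.
rng = np.random.default_rng(0)
num_layers, dim = 3, 8
views = rng.normal(size=(num_layers, dim))
Wq, Wk, Wv = (rng.normal(size=(dim, dim)) for _ in range(3))
z = cross_layer_fusion(views, target_layer=1, Wq=Wq, Wk=Wk, Wv=Wv)
print(z.shape)  # (8,)
```

In the Trans-SLE variant the edge views would come from precomputed static embeddings, while Trans-GAT would produce them with layer-specific GAT encoders before the same transformer-style fusion.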
Sep-30-2025