HERO: Heterogeneous Continual Graph Learning via Meta-Knowledge Distillation
Guiquan Sun, Xikun Zhang, Jingchao Ni, Dongjin Song
arXiv.org Artificial Intelligence
Heterogeneous graph neural networks have seen rapid progress in web applications such as social networks, knowledge graphs, and recommendation systems, driven by the inherent heterogeneity of web data. However, existing methods typically assume static graphs, while real-world graphs are continuously evolving. This dynamic nature requires models to adapt to new data while preserving existing knowledge. To this end, this work introduces HERO (HEterogeneous continual gRaph learning via meta-knOwledge distillation), a unified framework for continual learning on heterogeneous graphs. HERO employs meta-adaptation, a gradient-based meta-learning strategy that provides directional guidance for rapid adaptation to new tasks with limited samples. To enable efficient and effective knowledge reuse, we propose DiSCo (Diversity Sampling with semantic Consistency), a heterogeneity-aware sampling method that maximizes target node diversity and expands subgraphs along metapaths, retaining critical semantic and structural information with minimal overhead. Furthermore, HERO incorporates heterogeneity-aware knowledge distillation, which aligns knowledge at both the node and semantic levels to balance adaptation and retention across tasks. Extensive experiments on four web-related heterogeneous graph benchmarks demonstrate that HERO substantially mitigates catastrophic forgetting while achieving efficient and consistent knowledge reuse in dynamic web environments.
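To make the DiSCo idea in the abstract concrete, here is a minimal, hedged sketch of its two stages: greedily selecting a maximally diverse set of target nodes, then expanding each selected node along metapaths to retain semantic and structural context. All names (`diversity_sample`, `expand_along_metapaths`, the toy author/paper graph) are illustrative assumptions, not the paper's actual implementation; farthest-point sampling stands in for whatever diversity criterion HERO uses.

```python
import math

def diversity_sample(embeddings, k):
    """Greedy farthest-point sampling: pick k node ids whose embeddings are
    maximally spread out (an illustrative stand-in for DiSCo's diversity step).

    embeddings: dict mapping node id -> coordinate tuple.
    """
    ids = list(embeddings)
    chosen = [ids[0]]  # seed with an arbitrary first node
    while len(chosen) < k:
        best, best_d = None, -1.0
        for nid in ids:
            if nid in chosen:
                continue
            # distance to the nearest already-chosen node
            d = min(math.dist(embeddings[nid], embeddings[c]) for c in chosen)
            if d > best_d:
                best, best_d = nid, d
        chosen.append(best)
    return chosen

def expand_along_metapaths(seeds, edges, metapaths):
    """Expand each seed node along the given metapaths and return the node
    set of the induced subgraph.

    edges: dict keyed by relation (src_type, rel_name, dst_type), each value
           mapping a node id to its neighbour ids under that relation.
    metapaths: list of metapaths, each a list of relation keys to follow.
    """
    kept = set(seeds)
    for seed in seeds:
        for path in metapaths:
            frontier = {seed}
            for rel in path:
                # hop every frontier node across this relation
                frontier = {v for u in frontier
                            for v in edges.get(rel, {}).get(u, [])}
                kept |= frontier
    return kept

# Toy heterogeneous graph: authors write papers, papers cite papers.
emb = {"a1": (0.0, 0.0), "a2": (0.1, 0.0), "a3": (5.0, 5.0)}
edges = {
    ("author", "writes", "paper"): {"a1": ["p1"], "a3": ["p2"]},
    ("paper", "cites", "paper"): {"p1": ["p2"]},
}
metapaths = [[("author", "writes", "paper"), ("paper", "cites", "paper")]]

seeds = diversity_sample(emb, 2)            # ["a1", "a3"]
subgraph = expand_along_metapaths(["a1"], edges, metapaths)  # {"a1", "p1", "p2"}
```

The replayed subgraph thus keeps only a small, diverse set of targets plus their metapath neighbourhoods, which is what lets a rehearsal-style continual learner reuse old-task knowledge at minimal memory cost.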
Oct-21-2025