Appendix for Multi-task Graph Neural Architecture Search with Task-aware Collaboration and Curriculum
Neural Information Processing Systems
Notations

o   An operation
w   Model weight
α   The architecture parameter
N   The number of chunks
θ   The trainable parameter in the soft task-collaborative module
p   The parameter generated by Eq.(9)
p   The parameter generated by Eq.(11), which replaces p during curriculum training
δ   The parameter controlling graph structure diversity
γ   The parameter controlling task-wise curriculum training

BNRist is the abbreviation of Beijing National Research Center for Information Science and Technology.

Derivation Details

Here we provide the detailed derivation process of Eq.(10). Then we use Eq.(9) to substitute

Search Space

We consider a search space of standard layer-by-layer architectures without sophisticated connections such as residual or jumping connections, though our proposed method can be easily generalized to include them. We choose six widely used message-passing GNN layers as our operation candidate set O: GCN [4], GAT [9], GIN [10], SAGE [2], k-GNN [5], and ARMA [3]. Besides, we also adopt MLP, which does not consider graph structures.
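To illustrate how the architecture parameter α selects among the candidate operations in O, the following is a minimal sketch of a DARTS-style continuous relaxation: a softmax over α produces mixing weights, and the layer's output is the weighted sum of all candidate operations' outputs. The operation names and the scalar stand-in operations here are hypothetical placeholders, not the paper's actual implementation.

```python
import math

# Hypothetical stand-ins for the candidate operation set O from the paper
# (GCN, GAT, GIN, SAGE, k-GNN, ARMA, plus MLP). In a real GNN, each would
# be a message-passing layer acting on node features and the graph.
CANDIDATE_OPS = ["GCN", "GAT", "GIN", "SAGE", "k-GNN", "ARMA", "MLP"]

def softmax(alpha):
    """Relax the discrete operation choice into continuous mixing weights."""
    m = max(alpha)  # subtract the max for numerical stability
    exps = [math.exp(a - m) for a in alpha]
    s = sum(exps)
    return [e / s for e in exps]

def mixed_op(x, alpha, ops):
    """Output of one searched layer under the continuous relaxation.

    x:     input feature (a scalar here, for simplicity)
    alpha: one architecture parameter per candidate operation
    ops:   callables standing in for the candidate operations
    """
    weights = softmax(alpha)
    return sum(w * op(x) for w, op in zip(weights, ops))
```

During the search, α is trained jointly with the model weights w; after the search, the operation with the largest α is typically retained to form the discrete architecture.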