We thank R4 for rightly pointing out that local subgraphs can alleviate over-smoothing because, in each iteration, different subgraphs are fed into the GNN, which promotes inductive generalization. (5.4) R4 raised a question about hyper-parameters. We use random search on the validation set to select hyper-parameters and find that model performance is stable for a broad range of values. We follow standard episode training and the semi-supervised setting in which most nodes are not labeled, i.e., few-shot learning.
Graph Meta Learning via Local Subgraphs
Prevailing methods for graphs require abundant label and edge information for learning. When data for a new task are scarce, meta-learning can learn from prior experiences and form much-needed inductive biases for fast adaptation to new tasks. Here, we introduce G-Meta, a novel meta-learning algorithm for graphs. G-Meta uses local subgraphs to transfer subgraph-specific information and learn transferable knowledge faster via meta gradients. G-Meta learns how to quickly adapt to a new task using only a handful of nodes or edges in the new task, and does so by learning from data points in other graphs or related, albeit disjoint, label sets. G-Meta is theoretically justified, as we show that the evidence for a prediction can be found in the local subgraph surrounding the target node or edge. Experiments on seven datasets and nine baseline methods show that G-Meta outperforms existing methods by up to 16.3%. Unlike previous methods, G-Meta successfully learns in challenging few-shot learning settings that require generalization to completely new graphs and never-before-seen labels. Finally, G-Meta scales to large graphs, which we demonstrate on a new Tree-of-Life dataset comprising 1,840 graphs, a two-orders-of-magnitude increase in the number of graphs used in prior work.
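The core structural idea in the abstract is that the evidence for a node- or edge-level prediction lies in the local subgraph around the target, so each meta-learning episode operates on a small induced k-hop neighborhood rather than the full graph. A minimal sketch of that extraction step, using a plain adjacency-dict representation (the function name and data layout are illustrative, not G-Meta's actual code):

```python
from collections import deque

def k_hop_subgraph(adj, center, k):
    """Return the subgraph induced by all nodes within k hops of `center`.

    `adj` is a plain adjacency dict {node: [neighbors]}. BFS records the
    hop distance of each reached node; we stop expanding at depth k, then
    keep only edges whose endpoints both fall inside the neighborhood.
    """
    seen = {center: 0}           # node -> hop distance from center
    queue = deque([center])
    while queue:
        node = queue.popleft()
        if seen[node] == k:      # frontier at max depth: do not expand
            continue
        for nbr in adj.get(node, []):
            if nbr not in seen:
                seen[nbr] = seen[node] + 1
                queue.append(nbr)
    nodes = set(seen)
    # Induced subgraph: drop edges leading outside the k-hop neighborhood.
    return {n: [m for m in adj.get(n, []) if m in nodes] for n in nodes}
```

Because every episode sees only such a small neighborhood, the GNN is repeatedly trained on different subgraphs, which is also the mechanism the author response credits for alleviating over-smoothing.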
G-Meta: Distributed Meta Learning in GPU Clusters for Large-Scale Recommender Systems
Xiao, Youshao, Zhao, Shangchun, Zhou, Zhenglei, Huan, Zhaoxin, Ju, Lin, Zhang, Xiaolu, Wang, Lin, Zhou, Jun
Recently, a new paradigm, meta learning, has been widely applied to Deep Learning Recommendation Models (DLRM) and significantly improves statistical performance, especially in cold-start scenarios. However, existing systems are not tailored to meta-learning-based DLRM models and suffer from critical efficiency problems in distributed training on GPU clusters. This is because the conventional deep learning pipeline is not optimized for the two task-specific datasets and two update loops of meta learning. This paper provides a high-performance framework for large-scale training of optimization-based Meta DLRM models over the GPU cluster, namely G-Meta. First, G-Meta combines data parallelism and model parallelism with careful orchestration of computation and communication to enable high-speed distributed training. Second, it proposes a Meta-IO pipeline for efficient data ingestion that alleviates the I/O bottleneck. Experimental results show that G-Meta achieves notable training speedups without loss of statistical performance. Since early 2022, G-Meta has been deployed in Alipay's core advertising and recommender system, shortening continuous model delivery by four times. It also obtains a 6.48% improvement in Conversion Rate (CVR) and a 1.06% increase in CPM (Cost Per Mille) in Alipay's homepage display advertising, with the benefit of larger training samples and tasks.
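The "two task-specific datasets and two update loops" that the abstract says break the conventional pipeline are the support/query split and the inner/outer optimization of MAML-style meta learning. A minimal first-order sketch on a scalar model (y_hat = w * x with squared loss, analytic gradients; all names and the first-order simplification are illustrative assumptions, not G-Meta's implementation):

```python
def maml_step(w, tasks, inner_lr=0.1, outer_lr=0.1):
    """One optimization-based meta-learning (MAML-style) update.

    Each task is (support, query): two task-specific datasets of (x, y)
    pairs. The inner loop adapts w on the support set; the outer loop
    evaluates the adapted weights on the query set and accumulates a
    meta-gradient (first-order approximation, ignoring second-order terms).
    """
    meta_grad = 0.0
    for support, query in tasks:
        # Inner loop: one gradient step on the support set.
        g_in = sum(2 * (w * x - y) * x for x, y in support) / len(support)
        w_task = w - inner_lr * g_in
        # Outer loop: gradient of the query loss at the adapted weights.
        g_out = sum(2 * (w_task * x - y) * x for x, y in query) / len(query)
        meta_grad += g_out
    return w - outer_lr * meta_grad / len(tasks)
```

Note that every meta step touches two distinct mini-batches per task and runs two dependent gradient computations, which is exactly the access pattern a standard single-dataset, single-loop training pipeline does not anticipate.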