Multi-Label Graph Convolutional Network Representation Learning

arXiv.org Machine Learning

Knowledge representation of graph-based systems is fundamental across many disciplines. To date, most existing methods for representation learning primarily focus on networks with simplex labels, yet real-world objects (nodes) are inherently complex in nature and often carry rich semantics or multiple labels. In multi-label networks, not only does each node have multiple labels, but those labels are often highly correlated, and existing methods either ignore this correlation or fail to handle it during node representation learning. In this paper, we propose a novel multi-label graph convolutional network (ML-GCN) for learning node representations in multi-label networks. To fully explore label-label correlation and network topology, we model a multi-label network as two Siamese GCNs: a node-node-label graph and a label-label-node graph. The two GCNs each handle one aspect of representation learning, for nodes and for labels respectively, and they are seamlessly integrated under one objective function. The learned label representations effectively preserve inner-label interactions and node-label properties, and are then aggregated to enhance node representation learning under a unified training framework. Experiments and comparisons on multi-label node classification validate the effectiveness of the proposed approach.

Graphs have become increasingly common structures for organizing data in many complex systems such as sensor networks, citation networks, social networks, and more [1]. This development raises the need for efficient network representation (embedding) learning algorithms for real-world applications, which seek to learn low-dimensional vector representations of all nodes while preserving graph topology structures such as edge links, degrees, and communities.
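The abstract describes the two-GCN design only at a high level. The following is a minimal numpy sketch of that idea, assuming the node-node-label graph is the node graph augmented with label nodes and the label-label-node graph is a label co-occurrence graph; all sizes, the aggregation rule, and the random weights are illustrative and not the authors' implementation.

import numpy as np

def normalize_adj(A):
    """Symmetric normalization with self-loops: D^-1/2 (A + I) D^-1/2."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def gcn_layer(A_norm, H, W):
    """One graph-convolution layer: ReLU(A_norm @ H @ W)."""
    return np.maximum(A_norm @ H @ W, 0.0)

rng = np.random.default_rng(0)
n_nodes, n_labels, d_in, d_hid = 6, 3, 8, 4

A_nodes = rng.integers(0, 2, (n_nodes, n_nodes))           # toy node-node edges
A_nodes = np.triu(A_nodes, 1); A_nodes = A_nodes + A_nodes.T
Y = rng.integers(0, 2, (n_nodes, n_labels))                 # node-label assignments
C = (Y.T @ Y > 0).astype(float)                             # label-label co-occurrence
np.fill_diagonal(C, 0)

X_nodes = rng.normal(size=(n_nodes, d_in))                  # node features
X_labels = rng.normal(size=(n_labels, d_in))                # label features

# "Node-node-label" graph: node graph extended with label nodes connected to
# the nodes that carry them (a plausible reading of the abstract, not verified).
A_nnl = np.block([[A_nodes, Y], [Y.T, np.zeros((n_labels, n_labels))]])
H_nnl = gcn_layer(normalize_adj(A_nnl),
                  np.vstack([X_nodes, X_labels]),
                  rng.normal(size=(d_in, d_hid)))

# "Label-label-node" graph reduced here to the label co-occurrence graph.
H_lab = gcn_layer(normalize_adj(C), X_labels, rng.normal(size=(d_in, d_hid)))

# Aggregate: enrich each node embedding with the mean embedding of its labels.
node_emb = H_nnl[:n_nodes] + (Y @ H_lab) / np.maximum(Y.sum(1, keepdims=True), 1)
print(node_emb.shape)   # (6, 4)

In the full method the two GCNs would share one training objective; the sketch only shows the forward propagation and aggregation step.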


Hyperbolic Interaction Model For Hierarchical Multi-Label Classification

arXiv.org Machine Learning

Different from traditional classification tasks, which assume mutual exclusion of labels, hierarchical multi-label classification (HMLC) aims to assign multiple labels to every instance, with the labels organized under hierarchical relations. In fact, linguistic ontologies are intrinsically hierarchical. Besides the labels, the conceptual relations between words can also form hierarchical structures. It is therefore a challenge to learn mappings from the word space to the label space, and vice versa. We propose to model the word and label hierarchies by embedding them jointly in hyperbolic space. The main reason is that the tree-likeness of hyperbolic space matches the complexity of symbolic data with hierarchical structures. A new hyperbolic interaction model (HyperIM) is designed to learn label-aware document representations and make predictions for HMLC. Extensive experiments are conducted on three benchmark datasets. The results demonstrate that the new model can realistically capture the complex data structures and further improve performance for HMLC compared with state-of-the-art methods. To facilitate future research, our code is publicly available.
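As a rough illustration of scoring labels with hyperbolic geometry (not the released HyperIM code), the sketch below embeds words and labels in the Poincare ball and turns word-label geodesic distances into per-label probabilities; the mean aggregation and all dimensions are assumptions.

import numpy as np

def poincare_distance(u, v, eps=1e-9):
    """Geodesic distance between two points inside the unit Poincare ball."""
    sq = np.sum((u - v) ** 2)
    denom = (1 - np.sum(u ** 2)) * (1 - np.sum(v ** 2))
    return np.arccosh(1 + 2 * sq / (denom + eps))

rng = np.random.default_rng(0)
d, n_words, n_labels = 5, 7, 4

# Random points scaled to lie safely inside the unit ball (norm = 0.3).
words = rng.normal(size=(n_words, d))
words *= 0.3 / np.linalg.norm(words, axis=1, keepdims=True)
labels = rng.normal(size=(n_labels, d))
labels *= 0.3 / np.linalg.norm(labels, axis=1, keepdims=True)

# Label-aware document score: smaller average word-label distance -> higher score.
scores = np.array([[-poincare_distance(w, l) for l in labels] for w in words]).mean(axis=0)
probs = 1 / (1 + np.exp(-scores))    # independent per-label probabilities (multi-label output)
print(probs)

The actual model learns these embeddings end-to-end; the sketch only shows how distances in the ball can be read off as label scores.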


On Learning Vector Representations in Hierarchical Label Spaces

arXiv.org Machine Learning

An important problem in multi-label classification is to capture label patterns or the underlying structures that shape them. This paper addresses one such problem, namely how to exploit hierarchical structures over labels. We present a novel method to learn vector representations of a label space given a hierarchy of labels and label co-occurrence patterns. Our experimental results demonstrate qualitatively that the proposed method is able to learn regularities among labels by exploiting a label hierarchy as well as label co-occurrences. They highlight the importance of the hierarchical information for obtaining regularities that facilitate analogical reasoning over a label space. We also experimentally illustrate the dependency of the learned representations on the label hierarchy.
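One plausible way to realize "label vectors from a hierarchy plus co-occurrence" (not necessarily the paper's model) is to combine both signals in a single label-label matrix and factorize it; the toy hierarchy, instances, and dimensionality below are purely illustrative.

import numpy as np

labels = ["root", "animal", "dog", "cat", "vehicle", "car"]
idx = {l: i for i, l in enumerate(labels)}

hierarchy_edges = [("root", "animal"), ("animal", "dog"), ("animal", "cat"),
                   ("root", "vehicle"), ("vehicle", "car")]
# Toy label sets of training instances, used as the co-occurrence source.
instances = [["animal", "dog"], ["animal", "cat"], ["vehicle", "car"], ["animal", "dog"]]

n = len(labels)
M = np.zeros((n, n))
for a, b in hierarchy_edges:                       # hierarchy contributes symmetric edges
    M[idx[a], idx[b]] += 1.0
    M[idx[b], idx[a]] += 1.0
for inst in instances:                             # co-occurrence within each instance
    for a in inst:
        for b in inst:
            if a != b:
                M[idx[a], idx[b]] += 1.0

U, S, _ = np.linalg.svd(M)
emb = U[:, :3] * S[:3]                             # 3-dimensional label embeddings

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

print(cos(emb[idx["dog"]], emb[idx["cat"]]))       # siblings under "animal": high similarity
print(cos(emb[idx["dog"]], emb[idx["car"]]))       # different branches: lower similarity

The point of the sketch is only that mixing hierarchy edges with co-occurrence counts yields embeddings in which sibling labels end up close together.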


Non-intrusive Load Monitoring via Multi-label Sparse Representation based Classification

arXiv.org Machine Learning

This work follows the multi-label classification approach to non-intrusive load monitoring (NILM). We modify the popular sparse representation based classification (SRC) approach, originally developed for single-label classification, to solve multi-label classification problems. Results on the benchmark REDD and Pecan Street datasets show significant improvement over state-of-the-art techniques with a small volume of training data.

In non-intrusive load monitoring (NILM), the technical goal is to estimate the power consumption of individual appliances given the aggregate smart-meter readings [1]. The broader social objective is to feed this information back to the household so that occupants can reduce power consumption and thereby save energy.
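A minimal sketch of the underlying idea, assuming a standard SRC pipeline adapted to multi-label output: sparse-code the aggregate reading over a dictionary whose columns are per-appliance training signatures, then report every appliance class whose coefficient energy is non-negligible. The solver (plain ISTA), the synthetic signatures, and the threshold are illustrative, not the paper's exact method.

import numpy as np

def ista(D, y, lam=0.1, n_iter=200):
    """Iterative soft-thresholding for min_x 0.5*||y - D x||^2 + lam*||x||_1."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the least-squares gradient
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        g = D.T @ (D @ x - y)              # gradient of the least-squares term
        x = x - g / L
        x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)   # soft threshold
    return x

rng = np.random.default_rng(0)
m, per_class = 16, 5
classes = ["fridge", "heater", "washer"]
# Toy training dictionary: each column is a synthetic appliance signature,
# columns grouped by appliance class and normalized to unit length.
D = rng.normal(size=(m, per_class * len(classes)))
D /= np.linalg.norm(D, axis=0)

# Aggregate reading built from one fridge column and one washer column
# (multi-label ground truth: fridge + washer).
y = D[:, 1] + D[:, 2 * per_class + 3]

x = ista(D, y)
# Per-class coefficient energy; keep every class above a fraction of the maximum.
energy = np.array([np.sum(x[k * per_class:(k + 1) * per_class] ** 2)
                   for k in range(len(classes))])
active = [c for c, e in zip(classes, energy) if e > 0.1 * energy.max()]
print(active)   # expected to contain "fridge" and "washer" on this toy example

Single-label SRC would instead pick the one class with the smallest reconstruction residual; thresholding per-class energy (or residuals) is one simple way to extend it to multiple simultaneously active appliances.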


Learning Graph Representations with Embedding Propagation

Neural Information Processing Systems

We propose EP, Embedding Propagation, an unsupervised learning framework for graph-structured data. EP learns vector representations of graphs by passing two types of messages between neighboring nodes. Forward messages consist of label representations such as representations of words and other attributes associated with the nodes. Backward messages consist of gradients that result from aggregating the label representations and applying a reconstruction loss. Node representations are finally computed from the representation of their labels.
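To make the message-passing loop concrete, here is an illustrative numpy reduction of the idea (not the NIPS implementation): the forward message to a node is the mean of its neighbors' label embeddings, and the backward messages are the gradients of a reconstruction loss flowing back to those neighbors. The paper's loss is margin-based with negative sampling; the plain squared error used here is only for brevity and would collapse to a consensus embedding if run to convergence.

import numpy as np

rng = np.random.default_rng(0)
n, d, lr = 5, 4, 0.1
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4], 4: [3]}   # toy undirected graph
H = rng.normal(size=(n, d))                         # label/attribute embeddings per node

for _ in range(50):
    grad = np.zeros_like(H)
    for v, nbrs in adj.items():
        msg = H[nbrs].mean(axis=0)                  # forward message: aggregated neighbor labels
        diff = msg - H[v]                           # reconstruction error for node v
        grad[v] -= diff                             # d/dH[v] of 0.5 * ||msg - H[v]||^2
        for u in nbrs:                              # backward messages to the neighbors
            grad[u] += diff / len(nbrs)
    H -= lr * grad                                  # gradient step on the reconstruction loss

print(H.shape)                                      # final node representations, (5, 4)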