Sheaf Hypergraph Networks

Neural Information Processing Systems

Higher-order relations are widespread in nature, with numerous phenomena involving complex interactions that extend beyond simple pairwise connections.



Reviews: HyperGCN: A New Method For Training Graph Convolutional Networks on Hypergraphs

Neural Information Processing Systems

The relationships in many real-world networks are complex and go beyond pairwise associations. Hypergraphs provide a flexible and natural way to model such relationships. The authors propose HyperGCN, a novel way of training a GCN for semi-supervised learning on hypergraphs using tools from the spectral theory of hypergraphs, and introduce FastHyperGCN, a faster variant. They conduct experiments on co-authorship and co-citation hypergraphs to demonstrate the effectiveness of HyperGCN, and provide theoretical analyses of the results. The paper proposes 1-HyperGCN and HyperGCN, built on the hypergraph Laplacian and the generalized hypergraph Laplacian with mediators, respectively.
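As background for the spectral machinery such models build on, the widely used normalized hypergraph Laplacian of Zhou et al., L = I − Dv^(−1/2) H W De^(−1) Hᵀ Dv^(−1/2), can be computed directly from the incidence matrix. The sketch below is an illustrative NumPy implementation of that standard operator, not the paper's mediator-based variant:

```python
import numpy as np

def hypergraph_laplacian(H, w=None):
    """Normalized hypergraph Laplacian (Zhou et al.):
    L = I - Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2}.

    H : (n_nodes, n_edges) incidence matrix, H[v, e] = 1 iff node v is in hyperedge e.
    w : optional hyperedge weights (all ones by default).
    """
    n, m = H.shape
    w = np.ones(m) if w is None else np.asarray(w, float)
    dv = H @ w                     # node degrees
    de = H.sum(axis=0)             # hyperedge sizes
    theta = (np.diag(dv ** -0.5) @ H @ np.diag(w)
             @ np.diag(1.0 / de) @ H.T @ np.diag(dv ** -0.5))
    return np.eye(n) - theta

# Toy example: 4 nodes, 2 hyperedges {0, 1, 2} and {2, 3}
H = np.array([[1, 0],
              [1, 0],
              [1, 1],
              [0, 1]], float)
L = hypergraph_laplacian(H)
```

The resulting matrix is symmetric and positive semi-definite, with the vector Dv^(1/2)·1 in its null space, which is what makes it usable for diffusion-style message passing.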


Hypergraph Laplacian Eigenmaps and Face Recognition Problems

Tran, Loc Hoang

arXiv.org Artificial Intelligence

Abstract: Face recognition is an important topic in data science and biometric security, with applications in military, finance, and retail domains, to name a few. In this paper, a novel hypergraph Laplacian Eigenmaps method is proposed and combined with the k-nearest-neighbor method and/or the kernel ridge regression method to solve the face recognition problem. Experimental results illustrate that the accuracy of the combination of the novel hypergraph Laplacian Eigenmaps with one specific classification system is similar to the accuracy of the combination of the old symmetric normalized hypergraph Laplacian Eigenmaps method with the same classification system.

Keywords: face recognition, hypergraph, Laplacian Eigenmaps, classification

I. Introduction. Given a relational dataset, the pairwise relationships among the objects/entities/samples in the dataset can be represented as a weighted graph.
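To make the pipeline concrete, the sketch below embeds the nodes of a toy hypergraph with Laplacian eigenmaps (eigenvectors of the symmetric normalized hypergraph Laplacian for the smallest non-trivial eigenvalues) and pairs the embedding with a minimal k-nearest-neighbor classifier. The incidence matrix and helper names are illustrative, not taken from the paper:

```python
import numpy as np

# Toy hypergraph: hyperedges {0,1,2} and {3,4,5}, bridged by {2,3}.
H = np.array([[1, 0, 0],
              [1, 0, 0],
              [1, 0, 1],
              [0, 1, 1],
              [0, 1, 0],
              [0, 1, 0]], float)
dv, de = H.sum(axis=1), H.sum(axis=0)
Hn = H / np.sqrt(dv)[:, None]                    # Dv^{-1/2} H
L = np.eye(6) - Hn @ np.diag(1.0 / de) @ Hn.T    # symmetric normalized Laplacian

def eigenmap(L, dim):
    """Spectral embedding: eigenvectors of the smallest non-trivial eigenvalues."""
    vals, vecs = np.linalg.eigh(L)               # eigenvalues in ascending order
    return vecs[:, 1:dim + 1]                    # drop the trivial eigenvector

def knn_predict(X_train, y_train, x, k=1):
    """Plain k-nearest-neighbor vote in the embedded space."""
    idx = np.argsort(np.linalg.norm(X_train - x, axis=1))[:k]
    return np.bincount(y_train[idx]).argmax()

X = eigenmap(L, 2)                               # 2-D embedding of the 6 nodes
```

In a face recognition setting the hypergraph would be built from image features rather than hand-written, and kernel ridge regression could replace the k-NN step, but the embed-then-classify structure is the same.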


HyperMagNet: A Magnetic Laplacian based Hypergraph Neural Network

Benko, Tatyana, Buck, Martin, Amburg, Ilya, Young, Stephen J., Aksoy, Sinan G.

arXiv.org Artificial Intelligence

In data science, hypergraphs are natural models for data exhibiting multi-way relations, whereas graphs capture only pairwise ones. Nonetheless, many proposed hypergraph neural networks effectively reduce hypergraphs to undirected graphs via symmetrized matrix representations, potentially losing important information. We propose an alternative approach in which the hypergraph is represented as a non-reversible Markov chain. We use this Markov chain to construct a complex Hermitian Laplacian matrix, the magnetic Laplacian, which serves as the input to our proposed hypergraph neural network, HyperMagNet. We study HyperMagNet on the task of node classification and demonstrate its effectiveness over graph-reduction-based hypergraph neural networks.
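The magnetic Laplacian itself is straightforward to construct: symmetrize the edge weights and encode edge direction in a complex phase, L^(q) = Ds − As ∘ exp(i·2πq(A − Aᵀ)). The sketch below builds it for a plain directed graph; HyperMagNet derives its directed weights from a non-reversible Markov chain on the hypergraph, which this simplified example does not reproduce:

```python
import numpy as np

def magnetic_laplacian(A, q=0.25):
    """Magnetic Laplacian L^(q) = Ds - As * exp(i * 2*pi*q * (A - A^T)).

    A is a (possibly asymmetric) directed adjacency matrix. The result is
    Hermitian, so its eigenvalues are real, and the complex phase retains
    the edge-direction information that symmetrization alone discards.
    """
    A = np.asarray(A, float)
    As = (A + A.T) / 2.0                      # symmetrized weights
    theta = 2.0 * np.pi * q * (A - A.T)       # phase encoding direction
    Ds = np.diag(As.sum(axis=1))
    return Ds - As * np.exp(1j * theta)

# Directed 3-cycle: 0 -> 1 -> 2 -> 0.
A = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]], float)
L = magnetic_laplacian(A, q=0.25)
```

At q = 0 the phase vanishes and the ordinary symmetrized graph Laplacian is recovered; nonzero q lets a spectral network distinguish the two orientations of each edge.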


Sheaf Hypergraph Networks

Duta, Iulia, Cassarà, Giulia, Silvestri, Fabrizio, Liò, Pietro

arXiv.org Artificial Intelligence

Higher-order relations are widespread in nature, with numerous phenomena involving complex interactions that extend beyond simple pairwise connections. As a result, advancements in higher-order processing can accelerate the growth of various fields requiring structured data. Current approaches typically represent these interactions using hypergraphs. We enhance this representation by introducing cellular sheaves for hypergraphs, a mathematical construction that adds extra structure to the conventional hypergraph while maintaining its local, higher-order connectivity. Drawing inspiration from existing Laplacians in the literature, we develop two unique formulations of sheaf hypergraph Laplacians: linear and non-linear. Our theoretical analysis demonstrates that incorporating sheaves into the hypergraph Laplacian provides a more expressive inductive bias than standard hypergraph diffusion, creating a powerful instrument for effectively modelling complex data structures. We employ these sheaf hypergraph Laplacians to design two categories of models: Sheaf Hypergraph Neural Networks and Sheaf Hypergraph Convolutional Networks. These models generalize classical Hypergraph Networks often found in the literature. Through extensive experimentation, we show that this generalization significantly improves performance, achieving top results on multiple benchmark datasets for hypergraph node classification.
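To illustrate the flavor of a sheaf construction, the sketch below equips each node with a d-dimensional stalk and one restriction map per incident hyperedge, then penalizes each node's restricted value against the hyperedge mean, yielding a positive semi-definite operator L = BᵀB. This is a simplified, illustrative operator in the spirit of a linear sheaf hypergraph Laplacian, not the exact formulation of Duta et al.:

```python
import numpy as np

def sheaf_hypergraph_laplacian(hyperedges, F, n, d):
    """Simplified sheaf-style hypergraph Laplacian (illustrative only).

    Each node v carries a d-dimensional stalk; F[(v, e)] is a d x d
    restriction map sending node v's stalk into hyperedge e's space.
    Within each hyperedge, every node's restricted value is penalized
    against the hyperedge mean, so L = B^T B is positive semi-definite.
    """
    rows = []
    for ei, edge in enumerate(hyperedges):
        for v in edge:
            row = np.zeros((d, n * d))
            for u in edge:                          # subtract the hyperedge mean
                row[:, u * d:(u + 1) * d] -= F[(u, ei)] / len(edge)
            row[:, v * d:(v + 1) * d] += F[(v, ei)]
            rows.append(row)
    B = np.vstack(rows)
    return B.T @ B

# With identity restriction maps this reduces to ordinary hypergraph smoothing.
hedges = [(0, 1, 2), (2, 3)]
F = {(v, e): np.eye(2) for e, edge in enumerate(hedges) for v in edge}
L = sheaf_hypergraph_laplacian(hedges, F, n=4, d=2)
```

The extra expressivity comes from learning non-identity restriction maps: two nodes in the same hyperedge can then agree "up to a transformation" rather than being forced toward identical features.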


Central-Smoothing Hypergraph Neural Networks for Predicting Drug-Drug Interactions

Nguyen, Duc Anh, Nguyen, Canh Hao, Mamitsuka, Hiroshi

arXiv.org Artificial Intelligence

Predicting drug-drug interactions (DDI) is the problem of predicting the side effects (unwanted outcomes) of a pair of drugs from drug information and the known side effects of many other pairs. This can be formulated as predicting labels (i.e., side effects) for each pair of nodes in a DDI graph, whose nodes are drugs and whose edges connect interacting drug pairs with known labels. State-of-the-art methods for this problem are graph neural networks (GNNs), which leverage neighborhood information in the graph to learn node representations. For DDI, however, there are many labels with complicated relationships due to the nature of side effects. Typical GNNs fix labels as one-hot vectors, which do not reflect label relationships and may not achieve the highest performance in the difficult cases of infrequent labels. In this paper, we formulate DDI as a hypergraph where each hyperedge is a triple: two nodes for drugs and one node for a label. We then present CentSmoothie, a hypergraph neural network that learns representations of nodes and labels together via a novel central-smoothing formulation. We empirically demonstrate the performance advantages of CentSmoothie in simulations as well as on real datasets.
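The hypergraph construction described in the abstract is straightforward to materialize: each (drug, drug, side-effect) triple becomes one hyperedge over a node set containing both drug nodes and label nodes. The function below is an illustrative sketch of that incidence structure, not code from the paper:

```python
import numpy as np

def ddi_triple_incidence(triples, n_drugs, n_labels):
    """Incidence matrix of a DDI hypergraph whose hyperedges are
    (drug_i, drug_j, side-effect label) triples; label nodes are
    appended after the drug nodes."""
    H = np.zeros((n_drugs + n_labels, len(triples)))
    for e, (i, j, label) in enumerate(triples):
        H[i, e] = 1.0
        H[j, e] = 1.0
        H[n_drugs + label, e] = 1.0
    return H

# Three hypothetical interactions among 3 drugs with 2 side-effect labels.
triples = [(0, 1, 0), (0, 2, 1), (1, 2, 0)]
H = ddi_triple_incidence(triples, n_drugs=3, n_labels=2)
```

Because labels are first-class nodes, hyperedges sharing a label node let the network propagate information between drug pairs that exhibit the same side effect, which one-hot label vectors cannot do.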


Exact Inference in High-order Structured Prediction

Ke, Chuyang, Honorio, Jean

arXiv.org Machine Learning

Structured prediction has been widely used across machine learning over the past 20 years, with applications in social network analysis, computer vision, molecular biology, and natural language processing (NLP), among others. A common objective in these tasks is assigning or recovering labels: given a possibly noisy observation, the goal is to output a group label for each entity. In social network analysis, this could mean detecting communities based on user profiles and preferences [Kelley et al., 2012]. In computer vision, it could mean deciding whether a pixel belongs to the foreground or the background [Nowozin et al., 2011]. In biology, it is sometimes desirable to cluster molecules by structural similarity [Nugent and Meila, 2010].