Architectures of Topological Deep Learning: A Survey on Topological Neural Networks

Mathilde Papillon, Sophia Sanborn, Mustafa Hajij, Nina Miolane

arXiv.org Artificial Intelligence 

Many natural systems as diverse as social networks (Knoke and Yang, 2019) and proteins (Jha et al., 2022) are characterized by relational structure. This is the structure of interactions between components in the system, such as social interactions between individuals or electrostatic interactions between atoms. In Geometric Deep Learning (Bronstein et al., 2021), Graph Neural Networks (GNNs) (Zhou et al., 2020) have demonstrated remarkable achievements in processing relational data using graphs, mathematical objects commonly used to encode pairwise relations. However, the pairwise structure of graphs is limiting. Social interactions can involve more than two individuals, and electrostatic interactions more than two atoms. Topological Deep Learning (TDL) (Hajij et al., 2023; Bodnar, 2022) leverages more general abstractions to process data with higher-order relational structure. The theoretical guarantees (Bodnar et al., 2021a,b; Huang and Yang, 2021) of its models, Topological Neural Networks (TNNs), lead to state-of-the-art performance on many machine learning tasks (Dong et al., 2020; Hajij et al., 2022a; Barbarossa and Sardellitti, 2020; Chen et al., 2022), and reveal high potential for the applied sciences and beyond. However, the abstraction and fragmentation of mathematical notation across the TDL literature significantly limits the field's accessibility, while complicating model comparison and obscuring opportunities for innovation. To address this, we present an intuitive and systematic comparison of published TNN architectures.
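To make the pairwise-versus-higher-order distinction concrete, here is a minimal Python sketch (not from the paper; the variable names are hypothetical) that encodes a three-way social interaction first as a graph and then as a higher-order cell, as in the simplicial and hypergraph domains the survey covers:

```python
# Hypothetical illustration: a three-way interaction among a, b, c
# encoded two ways.
import itertools

people = ["a", "b", "c"]

# Pairwise (graph) encoding: the group interaction is flattened into
# three separate edges, so it is indistinguishable from three
# independent one-on-one interactions.
pairwise_edges = {frozenset(e) for e in itertools.combinations(people, 2)}

# Higher-order encoding: the group interaction is kept as a single
# rank-2 cell (a triangle in a simplicial complex, or a hyperedge),
# alongside its lower-rank faces.
higher_order_cells = {
    0: {frozenset({p}) for p in people},   # nodes
    1: pairwise_edges,                     # edges
    2: {frozenset(people)},                # the 3-way interaction itself
}

print(pairwise_edges)          # three binary relations
print(higher_order_cells[2])   # one ternary relation, preserved
```

TNNs operate on representations like `higher_order_cells`, exchanging messages between cells of different ranks rather than only between node pairs.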
