Livesay, Neal
TopoX: A Suite of Python Packages for Machine Learning on Topological Domains
Hajij, Mustafa, Papillon, Mathilde, Frantzen, Florian, Agerberg, Jens, AlJabea, Ibrahem, Ballester, Ruben, Battiloro, Claudio, Bernárdez, Guillermo, Birdal, Tolga, Brent, Aiden, Chin, Peter, Escalera, Sergio, Fiorellino, Simone, Gardaa, Odin Hoff, Gopalakrishnan, Gurusankar, Govil, Devendra, Hoppe, Josef, Karri, Maneel Reddy, Khouja, Jude, Lecha, Manuel, Livesay, Neal, Meißner, Jan, Mukherjee, Soham, Nikitin, Alexander, Papamarkou, Theodore, Prílepok, Jaro, Ramamurthy, Karthikeyan Natesan, Rosen, Paul, Guzmán-Sáenz, Aldo, Salatiello, Alessandro, Samaga, Shreyas N., Scardapane, Simone, Schaub, Michael T., Scofano, Luca, Spinelli, Indro, Telyatnikov, Lev, Truong, Quang, Walters, Robin, Yang, Maosheng, Zaghen, Olga, Zamzmi, Ghada, Zia, Ali, Miolane, Nina
We introduce TopoX, a Python software suite that provides reliable and user-friendly building blocks for computing and machine learning on topological domains that extend graphs: hypergraphs, simplicial, cellular, path, and combinatorial complexes. TopoX consists of three packages: TopoNetX facilitates constructing and computing on these domains, including working with nodes, edges, and higher-order cells; TopoEmbedX provides methods to embed topological domains into vector spaces, akin to popular graph-based embedding algorithms such as node2vec; TopoModelX is built on top of PyTorch and offers a comprehensive toolbox of higher-order message-passing functions for neural networks on topological domains. The extensively documented and unit-tested source code of TopoX is available under the MIT license at https://github.com/pyt-team.
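To make the package roles above concrete, here is a minimal sketch of the intended TopoNetX workflow: building a small simplicial complex and extracting the neighborhood matrices that downstream embedding and message-passing models consume. The class and method names used (SimplicialComplex, add_simplex, shape, incidence_matrix, adjacency_matrix) are assumptions based on the TopoNetX documentation and may differ across releases.

```python
# Minimal sketch of the TopoNetX workflow described above.
# Class/method names are assumed from the TopoNetX documentation
# and may differ between releases.
import toponetx as tnx

# Build a small simplicial complex from its maximal simplices.
sc = tnx.SimplicialComplex([[1, 2, 3], [2, 3, 4]])
sc.add_simplex([4, 5])  # attach an extra edge

# Number of cells per rank: (nodes, edges, triangles).
print(sc.shape)

# Linear-algebraic neighborhood structure consumed by downstream models:
# the boundary map from edges to nodes and the node adjacency matrix.
B1 = sc.incidence_matrix(rank=1)
A0 = sc.adjacency_matrix(rank=0)
print(B1.shape, A0.shape)
```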
ICML 2023 Topological Deep Learning Challenge: Design and Results
Papillon, Mathilde, Hajij, Mustafa, Jenne, Helen, Mathe, Johan, Myers, Audun, Papamarkou, Theodore, Birdal, Tolga, Dey, Tamal, Doster, Tim, Emerson, Tegan, Gopalakrishnan, Gurusankar, Govil, Devendra, Guzmán-Sáenz, Aldo, Kvinge, Henry, Livesay, Neal, Mukherjee, Soham, Samaga, Shreyas N., Ramamurthy, Karthikeyan Natesan, Karri, Maneel Reddy, Rosen, Paul, Sanborn, Sophia, Walters, Robin, Agerberg, Jens, Barikbin, Sadrodin, Battiloro, Claudio, Bazhenov, Gleb, Bernardez, Guillermo, Brent, Aiden, Escalera, Sergio, Fiorellino, Simone, Gavrilev, Dmitrii, Hassanin, Mohammed, Häusner, Paul, Gardaa, Odin Hoff, Khamis, Abdelwahed, Lecha, Manuel, Magai, German, Malygina, Tatiana, Ballester, Rubén, Nadimpalli, Kalyan, Nikitin, Alexander, Rabinowitz, Abraham, Salatiello, Alessandro, Scardapane, Simone, Scofano, Luca, Singh, Suraj, Sjölund, Jens, Snopov, Pavel, Spinelli, Indro, Telyatnikov, Lev, Testa, Lucia, Yang, Maosheng, Yue, Yixiao, Zaghen, Olga, Zia, Ali, Miolane, Nina
This paper presents the computational challenge on topological deep learning that was hosted within the ICML 2023 Workshop on Topology and Geometry in Machine Learning. The competition asked participants to provide open-source implementations of topological neural networks from the literature by contributing to the Python packages TopoNetX (data processing) and TopoModelX (deep learning). The challenge attracted twenty-eight qualifying submissions over its two-month duration. This paper describes the design of the challenge and summarizes its main findings.
Topological Deep Learning: Going Beyond Graph Data
Hajij, Mustafa, Zamzmi, Ghada, Papamarkou, Theodore, Miolane, Nina, Guzmán-Sáenz, Aldo, Ramamurthy, Karthikeyan Natesan, Birdal, Tolga, Dey, Tamal K., Mukherjee, Soham, Samaga, Shreyas N., Livesay, Neal, Walters, Robin, Rosen, Paul, Schaub, Michael T.
Topological deep learning is a rapidly growing field that pertains to the development of deep learning models for data supported on topological domains such as simplicial complexes, cell complexes, and hypergraphs, which generalize many domains encountered in scientific computations. In this paper, we present a unifying deep learning framework built upon a richer data structure that includes widely adopted topological domains. Specifically, we first introduce combinatorial complexes, a novel type of topological domain. Combinatorial complexes can be seen as generalizations of graphs that maintain certain desirable properties. Similar to hypergraphs, combinatorial complexes impose no constraints on the set of relations. In addition, combinatorial complexes permit the construction of hierarchical higher-order relations, analogous to those found in simplicial and cell complexes. Thus, combinatorial complexes generalize and combine useful traits of both hypergraphs and cell complexes, which have emerged as two promising abstractions that facilitate the generalization of graph neural networks to topological spaces. Second, building upon combinatorial complexes and their rich combinatorial and algebraic structure, we develop a general class of message-passing combinatorial complex neural networks (CCNNs), focusing primarily on attention-based CCNNs. We characterize permutation and orientation equivariances of CCNNs, and discuss pooling and unpooling operations within CCNNs in detail. Third, we evaluate the performance of CCNNs on tasks related to mesh shape analysis and graph learning. Our experiments demonstrate that CCNNs have competitive performance as compared to state-of-the-art deep learning models specifically tailored to the same tasks. Our findings demonstrate the advantages of incorporating higher-order relations into deep learning models in different applications.
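As a concrete illustration of the higher-order message passing that CCNNs perform, the sketch below pushes node (rank-0) features onto rank-2 cells of a combinatorial complex through an incidence matrix, written in plain PyTorch rather than the TopoModelX API. The tensor shapes, the random incidence matrix, and the single sum-and-transform update are illustrative assumptions; attention weights and messages between other pairs of ranks are omitted.

```python
# Plain-PyTorch sketch of one higher-order message-passing step, in the
# spirit of the CCNNs described above (this is NOT the TopoModelX API).
# The combinatorial complex is represented here only by an incidence
# matrix B02 relating rank-0 cells (nodes) to rank-2 cells.
import torch
import torch.nn as nn

n_nodes, n_cells, in_dim, out_dim = 6, 3, 8, 16

# B02[i, j] = 1 if node i is contained in rank-2 cell j (random for illustration).
B02 = torch.randint(0, 2, (n_nodes, n_cells)).float()
x0 = torch.randn(n_nodes, in_dim)  # features on rank-0 cells


class NodeToCellLayer(nn.Module):
    """Aggregate node messages onto rank-2 cells through the incidence matrix."""

    def __init__(self, in_dim: int, out_dim: int) -> None:
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x0: torch.Tensor, B02: torch.Tensor) -> torch.Tensor:
        # Each rank-2 cell receives the transformed sum of the features
        # of the nodes it contains, followed by a nonlinearity.
        return torch.relu(B02.T @ self.lin(x0))


layer = NodeToCellLayer(in_dim, out_dim)
x2 = layer(x0, B02)  # features on rank-2 cells
print(x2.shape)      # torch.Size([3, 16])
```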