Temporal Graph ODEs for Irregularly-Sampled Time Series

Alessio Gravina, Daniele Zambon, Davide Bacciu, Cesare Alippi

arXiv.org Artificial Intelligence 

Modern graph representation learning works mostly under the assumption of dealing with regularly sampled temporal graph snapshots, which is far from realistic, e.g., social networks and physical systems are characterized by continuous dynamics and sporadic observations. To address this limitation, we introduce the Temporal Graph Ordinary Differential Equation (TG-ODE) framework, which learns both the temporal and spatial dynamics from graph streams where the intervals between observations are not regularly spaced. We empirically validate the proposed approach on several

Some recent works propose to model input-output data relations as a continuous dynamic described by a learnable ordinary differential equation (ODE), instead of the discrete sequences of layers commonly used in deep learning. Neural ODE-based approaches have been exploited to model non-temporal data, including message-passing functions for learning node-level embeddings [Poli et al., 2019; Chamberlain et al., 2021; Eliasof et al., 2021; Rusch et al., 2022; Gravina et al., 2023]. Notably, relying on ODEs has shown promise for modeling complex temporal patterns from irregularly and sparsely sampled data [Chen et al., 2018; Rubanova et al., 2019; Kidger et al., 2020].
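To make the idea concrete, the following is a minimal sketch of a graph neural ODE whose node states evolve under a learnable message-passing vector field, integrated with the explicit Euler method over irregularly spaced observation times. All names here (`gnn_ode_rhs`, `W`, `A`) are illustrative assumptions, not the parameterization used by TG-ODE itself:

```python
import numpy as np

def gnn_ode_rhs(H, A, W):
    """dH/dt = tanh(A @ H @ W): a simple message-passing vector field,
    where A is the graph adjacency and W a learnable weight matrix."""
    return np.tanh(A @ H @ W)

def integrate(H0, A, W, timestamps):
    """Euler-integrate node states across irregularly spaced timestamps.
    The step size dt adapts to each gap between observations, which is
    how an ODE formulation naturally absorbs irregular sampling."""
    H = H0
    for t_prev, t_next in zip(timestamps[:-1], timestamps[1:]):
        dt = t_next - t_prev  # irregular interval between observations
        H = H + dt * gnn_ode_rhs(H, A, W)
    return H

# Toy example: 3 nodes on a path graph, 2 features per node,
# observed at irregularly spaced times.
rng = np.random.default_rng(0)
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
H0 = rng.standard_normal((3, 2))
W = rng.standard_normal((2, 2))
H_T = integrate(H0, A, W, [0.0, 0.3, 1.1, 1.15, 2.0])
print(H_T.shape)
```

A fixed stack of discrete layers would apply the same update regardless of elapsed time; scaling each update by `dt` is what lets the continuous-time view handle sporadic observations.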
