Neural Structure Learning with Stochastic Differential Equations

Wang, Benjie, Jennings, Joel, Gong, Wenbo

arXiv.org Machine Learning 

Time-series data are ubiquitous in the real world, often comprising a series of data points recorded at varying time intervals. Understanding the underlying structures between variables associated with temporal processes is of paramount importance for numerous real-world applications (Spirtes et al., 2000; Berzuini et al., 2012; Peters et al., 2017). Although randomised experiments are considered the gold standard for unveiling such relationships, they are frequently hindered by factors such as cost and ethical concerns. Structure learning seeks to infer hidden structures from purely observational data, offering a powerful approach for a wide array of applications (Bellot et al., 2021; Löwe et al., 2022; Runge, 2018; Tank et al., 2021; Pamfil et al., 2020; Gong et al., 2022).

However, many existing structure learning methods for time series are inherently discrete, assuming that the underlying temporal process is discretized in time and requiring uniform sampling intervals across the entire time range. Consequently, these models face two key limitations: (i) they may misrepresent the true underlying process when it is continuous in time, potentially leading to incorrectly inferred relationships; and (ii) they struggle to handle irregular sampling intervals, which frequently arise in fields such as biology (Trapnell et al., 2014; Qiu et al., 2017; Qian et al., 2020) and climate science (Bracco et al., 2018; Raia, 2008).

Although prior work (Bellot et al., 2021) also attempts to infer the underlying structure from a continuous-time perspective, its framework is based on ordinary differential equations (ODEs).
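The irregular-sampling setting described above can be made concrete with a small simulation. The sketch below (illustrative only; the matrix `A`, noise level, and subsampling scheme are assumptions, not the paper's method) uses Euler–Maruyama to simulate a two-variable linear SDE whose drift matrix encodes a single directed edge X1 → X2, then keeps observations at irregular time points, exactly the regime where uniform-interval discrete-time structure learners break:

```python
import numpy as np

# Illustrative sketch (hypothetical parameters): simulate the linear SDE
#   dX_t = A X_t dt + sigma dW_t
# with Euler-Maruyama on a fine grid, then subsample irregularly.
rng = np.random.default_rng(0)

A = np.array([[-1.0, 0.0],      # X1 has no parent
              [ 0.8, -1.0]])    # X1 -> X2: the "structure" to be recovered
sigma, dt, n_steps = 0.1, 1e-3, 10_000

x = np.zeros((n_steps + 1, 2))
x[0] = [1.0, 0.0]
for k in range(n_steps):
    drift = A @ x[k]
    noise = sigma * np.sqrt(dt) * rng.standard_normal(2)
    x[k + 1] = x[k] + drift * dt + noise  # Euler-Maruyama step

# Irregular sampling: keep each fine-grid point with 5% probability,
# mimicking observation times that are not uniformly spaced.
keep = rng.random(n_steps + 1) < 0.05
t_obs = np.arange(n_steps + 1)[keep] * dt
x_obs = x[keep]
gaps = np.diff(t_obs)
print(f"{len(t_obs)} observations; inter-sample gaps span "
      f"[{gaps.min():.4f}, {gaps.max():.4f}] -- far from uniform")
```

A discrete-time model fit to `x_obs` under the assumption of a constant sampling interval conflates the varying gaps with the dynamics themselves, which is precisely limitation (ii) above; a continuous-time formulation consumes `t_obs` directly.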
