Scalable Structure Learning of Continuous-Time Bayesian Networks from Incomplete Data
Dominik Linzner, Michael Schmidt, Heinz Koeppl
Neural Information Processing Systems
Continuous-time Bayesian Networks (CTBNs) represent a compact yet powerful framework for understanding multivariate time-series data. Given complete data, parameters and structure can be estimated efficiently in closed form. However, if data is incomplete, the latent states of the CTBN have to be estimated by laboriously simulating the intractable dynamics of the assumed CTBN. This is especially problematic for structure learning, where the simulation has to be repeated for each element of a super-exponentially growing set of possible structures. In order to circumvent this notorious bottleneck, we develop a novel gradient-based approach to structure learning.
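As a minimal sketch of the closed-form estimation the abstract alludes to: with complete data, the maximum-likelihood transition intensities of a continuous-time Markov process follow directly from two sufficient statistics, the total dwell time in each state and the count of observed transitions between states. The function below is an illustrative assumption of that standard computation, not code from the paper.

```python
import numpy as np

def mle_intensity_matrix(dwell_times, transition_counts):
    """Closed-form MLE for a CTMC intensity matrix (hypothetical sketch).

    dwell_times: T[x] = total time spent in state x.
    transition_counts: M[x, x'] = number of observed jumps x -> x'.
    Off-diagonal rates are q[x, x'] = M[x, x'] / T[x]; the diagonal is
    set so each row sums to zero, as required of an intensity matrix.
    """
    T = np.asarray(dwell_times, dtype=float)
    M = np.asarray(transition_counts, dtype=float)
    Q = M / T[:, None]              # off-diagonal rates
    np.fill_diagonal(Q, 0.0)        # ignore any self-transition counts
    np.fill_diagonal(Q, -Q.sum(axis=1))  # rows of an intensity matrix sum to 0
    return Q

# Example: 10s in state 0 with 2 jumps to state 1; 5s in state 1 with 2 jumps back.
Q = mle_intensity_matrix([10.0, 5.0], [[0, 2], [2, 0]])
# Q == [[-0.2, 0.2], [0.4, -0.4]]
```

With incomplete data these sufficient statistics are unobserved, which is exactly why the latent dynamics must otherwise be simulated, the bottleneck the paper targets.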
Mar-18-2020, 22:01:04 GMT