Modeling Continuous Stochastic Processes with Dynamic Normalizing Flows
Normalizing flows transform a simple base distribution into a complex target distribution and have proved to be powerful models for data generation and density estimation. In this work, we propose a novel type of normalizing flow driven by a differential deformation of the continuous-time Wiener process. As a result, we obtain a rich time series model whose observable process inherits many of the appealing properties of its base process, such as efficient computation of likelihoods and marginals. Furthermore, our continuous treatment provides a natural framework for irregular time series with an independent arrival process, including straightforward interpolation. We illustrate the desirable properties of the proposed model on popular stochastic processes and demonstrate its superior flexibility to variational RNN and latent ODE baselines in a series of experiments on synthetic and real-world data.
Modeling Continuous Stochastic Processes with Dynamic Normalizing Flows: Supplementary Materials
Ruizhi Deng, Bo Chang, Marcus A. Brubaker, Greg Mori, Andreas Lehrmann
We base the justification on the following two propositions. (This work was developed during an internship at Borealis AI.) Timestamps for the training and test sequences are drawn from Poisson processes. For the mixture of OU processes (MOU), we sample 5000 sequences from each of two different OU processes and mix them to obtain 10000 sequences. As mentioned in Section 5.2 of the paper, we compare our models against the baselines on three datasets: Mujoco-Hopper, the Beijing Air-Quality dataset (BAQD), and the PTB Diagnostic dataset. The sequence length of the Mujoco-Hopper dataset is 200 and the sequence length of BAQD is 168.
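The MOU data generation described above can be sketched as follows. The OU parameters, step size, sequence length, and the Euler-Maruyama discretization are assumptions for illustration; the paper's exact settings are not given here:

```python
import numpy as np

def sample_ou(n_seq, n_steps, dt=0.1, theta=1.0, mu=0.0, sigma=0.5, seed=0):
    """Sample sequences from an Ornstein-Uhlenbeck process
    dX_t = theta * (mu - X_t) dt + sigma dW_t via Euler-Maruyama."""
    rng = np.random.default_rng(seed)
    x = np.zeros((n_seq, n_steps))
    for t in range(1, n_steps):
        dw = rng.normal(scale=np.sqrt(dt), size=n_seq)  # Wiener increment
        x[:, t] = x[:, t - 1] + theta * (mu - x[:, t - 1]) * dt + sigma * dw
    return x

# Mixture of two OU processes: 5000 sequences from each, mixed into 10000.
mou = np.concatenate([
    sample_ou(5000, 100, theta=1.0, mu=0.0, seed=0),
    sample_ou(5000, 100, theta=2.0, mu=1.0, seed=1),
])
```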
Review for NeurIPS paper: Modeling Continuous Stochastic Processes with Dynamic Normalizing Flows
Weaknesses: No Explanation of Transformations of Stochastic Processes: I was under the impression that transforming / reparameterizing a stochastic process is non-trivial. Thus, I was expecting Equation 7 to include a second-derivative term. I'm not saying that Equation 7 is wrong, per se; transforming just the increments agrees with intuition. However, the problem is that the paper provides no explanation or mathematical references for stochastic processes and their transformations. There are *zero* citations in both Section 2.2 and Section 3.1.
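For context on the reviewer's point, Itô's lemma (standard background, not cited from the paper) is where the second-derivative term would arise when transforming a diffusion pathwise:

```latex
% Itô's lemma: for X_t with dX_t = \mu_t\,dt + \sigma_t\,dW_t and smooth f,
df(X_t, t) = \Big(\partial_t f + \mu_t\,\partial_x f
            + \tfrac{1}{2}\sigma_t^2\,\partial_x^2 f\Big)\,dt
            + \sigma_t\,\partial_x f\,dW_t .
```

Transforming the finite-dimensional marginals via the ordinary change-of-variables formula avoids this correction term, which may be why Equation 7 lacks it; the reviewer's request is for the paper to state and cite this distinction explicitly.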
Review for NeurIPS paper: Modeling Continuous Stochastic Processes with Dynamic Normalizing Flows
One reviewer recommended borderline rejection, but in my opinion the authors successfully addressed their concerns in the rebuttal. Recommendations: The authors are encouraged to clearly address the reviewers' concern about potential similarities of the approach to a Kalman filter with nonlinear outputs, as well as the issues related to background, related work, and the motivation for continuity.
Authors: Ruizhi Deng, Bo Chang, Marcus A. Brubaker, Greg Mori, Andreas Lehrmann