
Modeling Continuous Stochastic Processes with Dynamic Normalizing Flows

Neural Information Processing Systems

Normalizing flows transform a simple base distribution into a complex target distribution and have proved to be powerful models for data generation and density estimation. In this work, we propose a novel type of normalizing flow driven by a differential deformation of the continuous-time Wiener process. As a result, we obtain a rich time series model whose observable process inherits many of the appealing properties of its base process, such as efficient computation of likelihoods and marginals. Furthermore, our continuous treatment provides a natural framework for irregular time series with an independent arrival process, including straightforward interpolation. We illustrate the desirable properties of the proposed model on popular stochastic processes and demonstrate its superior flexibility to variational RNN and latent ODE baselines in a series of experiments on synthetic and real-world data.
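To make the abstract's claim of "efficient computation of likelihoods" concrete, here is a minimal sketch of the change-of-variables machinery for a flow applied to a discretized Wiener base process. This is an illustration only, not the paper's model: the elementwise affine map and all names (`wiener_path`, `flow_log_likelihood`, `s`, `b`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def wiener_path(ts):
    """Sample a Wiener process at strictly increasing times ts > 0."""
    dt = np.diff(ts, prepend=0.0)
    return np.cumsum(rng.normal(0.0, np.sqrt(dt)))

def flow_log_likelihood(y, ts, s, b):
    """Exact log-density of observations y under an elementwise affine
    flow y = exp(s) * w + b of a Wiener base path w observed at times ts.

    Invert the flow, score the independent Gaussian Wiener increments,
    and subtract log|det J| = sum(s) per the change-of-variables formula.
    """
    w = (y - b) * np.exp(-s)        # inverse flow
    dt = np.diff(ts, prepend=0.0)   # increment durations
    dw = np.diff(w, prepend=0.0)    # independent N(0, dt) increments
    log_base = -0.5 * (dw**2 / dt + np.log(2.0 * np.pi * dt))
    return log_base.sum() - s.sum()
```

Because the base increments are independent Gaussians, the likelihood of an irregularly sampled path factorizes over observation times, which is the property the abstract highlights for irregular time series.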


Review for NeurIPS paper: Modeling Continuous Stochastic Processes with Dynamic Normalizing Flows

Neural Information Processing Systems

Weaknesses: No Explanation of Transformations of Stochastic Processes: I was under the impression that transforming / reparameterizing a stochastic process is non-trivial. Thus, I was expecting Equation 7 to include a second-derivative term. I'm not saying that Equation 7 is wrong, per se: transforming just the increments agrees with intuition. However, the problem is that the paper provides no explanation or mathematical references for stochastic processes and their transformations. There are *zero* citations in both Section 2.2 and Section 3.1.
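For background (my gloss, not the paper's notation): the second-derivative term the reviewer anticipates is the correction supplied by Itô's lemma for a smooth, time-dependent transformation $f$ of an Itô process $X_t$ with quadratic variation $\langle X \rangle_t$:

$$
dY_t = \partial_t f(X_t, t)\, dt + \partial_x f(X_t, t)\, dX_t + \tfrac{1}{2}\, \partial_{xx} f(X_t, t)\, d\langle X \rangle_t .
$$

The correction $\tfrac{1}{2}\partial_{xx} f\, d\langle X\rangle_t$ vanishes when $f$ is linear in $x$, or when the transformation acts on the increments rather than on the path itself, which is the intuition the reviewer grants the paper.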


Review for NeurIPS paper: Modeling Continuous Stochastic Processes with Dynamic Normalizing Flows

Neural Information Processing Systems

One reviewer recommended borderline rejection, but in my opinion the authors successfully addressed their concerns in the rebuttal. Recommendations: The authors are encouraged to clearly address the reviewers' concern about potential similarities between the approach and a Kalman filter with nonlinear outputs, as well as the issues related to background, related work, and the motivation for continuity.

