A Bayesian estimation approach to analyze non-Gaussian data-generating processes with latent classes

arXiv.org Machine Learning

A large amount of observational data has accumulated across many fields in recent years, and there is a growing need to estimate the processes that generate these data. The linear non-Gaussian acyclic model (LiNGAM), which exploits the non-Gaussianity of external influences, has been proposed to estimate the data-generating processes of variables. However, the estimation results can be biased when latent classes are present. In this paper, we first review LiNGAM, its extended model, and the estimation procedure for LiNGAM in a Bayesian framework. We then propose a new Bayesian estimation procedure that corrects this bias.
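The identifiability idea underlying LiNGAM can be illustrated with a toy pairwise direction test (a sketch of the idea only, not the Bayesian procedure the paper proposes; the squared-residual dependence score below is our own simplification):

```python
import numpy as np

rng = np.random.default_rng(1)

def lingam_direction(x, y):
    """Toy pairwise causal-direction test in the spirit of LiNGAM.
    In the true causal direction the regression residual is independent
    of the regressor; with non-Gaussian noise the reverse regression
    leaves a detectable dependence.  Dependence is scored here by the
    correlation between the squared regressor and the squared residual
    (an illustrative choice, not the paper's criterion)."""
    def dep(cause, effect):
        slope = np.cov(cause, effect)[0, 1] / np.var(cause)
        resid = effect - slope * cause
        return abs(np.corrcoef(cause ** 2, resid ** 2)[0, 1])
    return "x->y" if dep(x, y) < dep(y, x) else "y->x"

# uniform (non-Gaussian) external influences make the direction identifiable
x = rng.uniform(-1, 1, 20000)
y = 0.8 * x + rng.uniform(-1, 1, 20000)
```

With Gaussian noise both regressions would leave independent residuals and the two directions would be indistinguishable, which is why the model rests on non-Gaussianity.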


Novel approach to nonlinear/non-Gaussian Bayesian state estimation

Classics

"An algorithm, the bootstrap filter, is proposed for implementing recursive Bayesian filters. The required density of the state vector is represented as a set of random samples, which are updated and propagated by the algorithm. The method is not restricted by assumptions of linearity or Gaussian noise: it may be applied to any state transition or measurement model. A simulation example of the bearings only tracking problem is presented. This simulation includes schemes for improving the efficiency of the basic algorithm. For this example, the performance of the bootstrap filter is greatly superior to the standard extended Kalman filter" IEEE Proceedings For Radar and Signal Processing 140 (2), 107–113
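A minimal version of the bootstrap filter described above, applied to a one-dimensional random-walk model (the paper's bearings-only tracking example is more involved; the model and parameter choices here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_filter(observations, n_particles=1000,
                     trans_std=1.0, obs_std=1.0):
    """Minimal bootstrap (particle) filter for the toy model
    x_t = x_{t-1} + v_t,  y_t = x_t + w_t.
    The state density is represented by a cloud of samples that is
    propagated through the transition model, reweighted by the
    measurement likelihood, and resampled."""
    particles = rng.normal(0.0, 1.0, n_particles)   # samples from the prior
    estimates = []
    for y in observations:
        # propagate each sample through the state-transition model
        particles = particles + rng.normal(0.0, trans_std, n_particles)
        # weight each sample by the measurement likelihood p(y | x)
        weights = np.exp(-0.5 * ((y - particles) / obs_std) ** 2)
        weights /= weights.sum()
        # resample to obtain an equally weighted sample set
        particles = rng.choice(particles, size=n_particles, p=weights)
        estimates.append(particles.mean())
    return np.array(estimates)
```

Because nothing assumes linearity or Gaussian noise, swapping in a nonlinear transition or a non-Gaussian likelihood changes only the two model lines, which is the flexibility the abstract emphasizes over the extended Kalman filter.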


Estimating High-dimensional Non-Gaussian Multiple Index Models via Stein's Lemma

Neural Information Processing Systems

We consider estimating the parametric components of semiparametric multi-index models in high dimensions. To bypass the requirements of Gaussianity or elliptical symmetry of covariates in existing methods, we propose to leverage a second-order Stein's method with score function-based corrections. We prove that our estimator achieves a near-optimal statistical rate of convergence even when the score function or the response variable is heavy-tailed. To establish the key concentration results, we develop a data-driven truncation argument that may be of independent interest.
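The flavor of score-based index estimation can be sketched with the simpler first-order Stein identity E[y S(x)] ∝ β for a single-index model (the paper uses a second-order method with a truncation argument; the Laplace covariates and their score are our own illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(2)

def stein_index_estimate(X, y):
    """Estimate the index direction beta of a single-index model
    y = f(beta^T x) + noise via the first-order Stein identity
    E[y S(x)] = E[f'(beta^T x)] * beta, where S(x) = -grad log p(x)
    is the score of the covariate density.  Covariates here are
    Laplace(0, 1), whose score is sign(x); no Gaussianity is needed,
    only a known (or estimated) score function."""
    score = np.sign(X)                       # score of the Laplace density
    beta_hat = (y[:, None] * score).mean(axis=0)
    return beta_hat / np.linalg.norm(beta_hat)

# simulate a single-index model with non-Gaussian covariates
beta = np.array([1.0, 2.0, 0.0, -1.0, 1.0])
beta /= np.linalg.norm(beta)
X = rng.laplace(0.0, 1.0, (50000, 5))
y = np.tanh(X @ beta) + 0.1 * rng.normal(0.0, 1.0, 50000)
```

The direction is recovered up to scale whenever E[f'] is nonzero; the second-order method in the paper handles the cases (e.g. even link functions) where this first-order moment vanishes.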


State Space Gaussian Processes with Non-Gaussian Likelihood

arXiv.org Machine Learning

We provide a comprehensive overview and tooling for GP modeling with non-Gaussian likelihoods using state space methods. The state space formulation allows for solving one-dimensional GP models in $\mathcal{O}(n)$ time and memory complexity. While existing literature has focused on the connection between GP regression and state space methods, the computational primitives allowing for inference using general likelihoods in combination with the Laplace approximation (LA), variational Bayes (VB), and assumed density filtering (ADF) / expectation propagation (EP) schemes have been largely overlooked. We present means of combining the efficient $\mathcal{O}(n)$ state space methodology with existing inference methods. We also further extend existing methods, and provide unifying code implementing all approaches.
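The $\mathcal{O}(n)$ state-space idea can be sketched for the exponential (Matérn-1/2) kernel, whose equivalent stochastic differential equation is a one-dimensional Ornstein–Uhlenbeck process. This is a Gaussian-likelihood sketch of the machinery, not the paper's general-likelihood LA/VB/ADF/EP schemes, and the hyperparameter names are ours:

```python
import numpy as np

def sde_gp_loglik(t, y, sigma2=1.0, ell=1.0, noise=0.1):
    """O(n) log marginal likelihood of a GP with the exponential
    (Matern-1/2) kernel k(r) = sigma2 * exp(-r / ell), computed by
    Kalman filtering the equivalent 1-D Ornstein-Uhlenbeck state-space
    model.  Inputs t must be sorted in increasing order."""
    m, P, ll = 0.0, sigma2, 0.0
    prev = t[0]
    for tk, yk in zip(t, y):
        a = np.exp(-(tk - prev) / ell)        # OU transition over the gap
        m, P = a * m, a * a * P + sigma2 * (1.0 - a * a)
        s = P + noise                         # innovation variance
        v = yk - m                            # innovation
        ll += -0.5 * (np.log(2.0 * np.pi * s) + v * v / s)
        k = P / s                             # Kalman gain
        m, P = m + k * v, (1.0 - k) * P       # measurement update
        prev = tk
    return ll
```

The filter's prediction-error decomposition gives exactly the same marginal likelihood as the dense $\mathcal{O}(n^3)$ GP computation, but in a single linear-time sweep.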


State estimation under non-Gaussian Lévy noise: A modified Kalman filtering method

arXiv.org Machine Learning

The Kalman filter is extensively used for state estimation in linear systems under Gaussian noise. When non-Gaussian Lévy noise is present, the conventional Kalman filter may fail to be effective because the Lévy noise may have infinite variance. A modified Kalman filter for linear systems with non-Gaussian Lévy noise is devised. It works effectively with reasonable computational cost. Simulation results are presented to illustrate this non-Gaussian filtering method.
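One simple way to guard a Kalman filter against heavy-tailed shocks is to bound the innovation before the measurement update. This is an illustrative robustification under our own model and parameter choices, not the specific modification devised in the paper:

```python
import numpy as np

def robust_kalman(observations, a=1.0, q=1.0, r=1.0, clip=3.0):
    """Scalar Kalman filter for x_t = a*x_{t-1} + v_t, y_t = x_t + w_t,
    with the innovation clipped at `clip` standard deviations so that a
    single heavy-tailed (e.g. Levy-type) measurement shock cannot drag
    the state estimate arbitrarily far.  Illustrative only, not the
    paper's modification."""
    m, P = 0.0, 1.0
    out = []
    for y in observations:
        m, P = a * m, a * a * P + q                 # predict
        s = P + r                                   # innovation variance
        v = y - m                                   # innovation
        v = np.clip(v, -clip * np.sqrt(s), clip * np.sqrt(s))  # bound shock
        k = P / s                                   # Kalman gain
        m, P = m + k * v, (1.0 - k) * P             # update
        out.append(m)
    return np.array(out)
```

With `clip` set very large the code reduces to the conventional Kalman filter, which makes the effect of occasional extreme observations easy to compare.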