Bayesian nightmare. Solved!

#artificialintelligence

Who has not heard that Bayesian statistics are difficult, computationally slow, unable to scale to big data, and subjective in their results; and that we don't need them at all? Do we really need to learn a lot of math and a lot of classical statistics before approaching Bayesian techniques? Why do the most popular books on Bayesian statistics run over 500 pages? Is the Bayesian nightmare real, or a myth? Someone once compared the Bayesian approach to the kitchen of a Michelin-star chef, with a high-quality chef's knife, a stockpot, and an expensive sauté pan, while Frequentism is like your ordinary kitchen, with banana slicers and pasta pots. People talk about Bayesianism and Frequentism as if they were two different religions. Does Bayes really put more burden on the data scientist to use her brain at the outset, because Bayesianism is a religion for the brightest of the brightest?


Introduction to Bayesian Logistic Regression

#artificialintelligence

Let's review the concepts underlying Bayesian statistical analysis by walking through a simple classification model. The data come from the 1988 Bangladesh Fertility Survey, in which 1934 observations were taken from women in urban and rural areas. The authors of the dataset, Huq and Cleland, aimed to determine trends and causes of fertility, as well as differences in fertility and child mortality. We will use the data to train a Bayesian logistic regression model that predicts whether a given woman uses contraception. The dataset is well suited to Bayesian logistic regression because quantifying uncertainty matters when analyzing fertility, a major component of the population dynamics that determine the size, structure, and composition of populations (source 1, source 2).
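
To make the setup concrete, here is a minimal sketch of such a model in PyMC3, assuming a hypothetical design with a centered age variable, an urban/rural indicator, and a binary contraception-use outcome. The column names, priors, and simulated data are illustrative stand-ins, not the actual survey variables or the article's model:

```python
import numpy as np
import pymc3 as pm

# Simulated stand-ins for the survey columns (illustrative only)
rng = np.random.default_rng(0)
age = rng.normal(0.0, 9.0, size=500)            # centered age
urban = rng.integers(0, 2, size=500)            # 1 = urban, 0 = rural
uses_contraception = rng.integers(0, 2, size=500)  # binary outcome

with pm.Model() as model:
    # Weakly informative Gaussian priors on intercept and coefficients
    intercept = pm.Normal("intercept", mu=0.0, sigma=10.0)
    b_age = pm.Normal("b_age", mu=0.0, sigma=2.5)
    b_urban = pm.Normal("b_urban", mu=0.0, sigma=2.5)

    # Logistic link: P(uses contraception) = sigmoid(linear predictor)
    p = pm.math.sigmoid(intercept + b_age * age + b_urban * urban)
    pm.Bernoulli("obs", p=p, observed=uses_contraception)

    # Draw posterior samples (PyMC3 picks NUTS by default here)
    trace = pm.sample(1000, tune=1000, chains=2, random_seed=0)

print(pm.summary(trace))
```

The posterior summary then gives credible intervals for each coefficient, which is exactly the uncertainty quantification the blurb highlights.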


Markov Chain Monte Carlo for Bayesian Inference - The Metropolis Algorithm - QuantStart

#artificialintelligence

In previous discussions of Bayesian inference we introduced Bayesian statistics and considered how to infer a binomial proportion using the concept of conjugate priors. We discussed the fact that not all models can make use of conjugate priors, and thus the posterior distribution often needs to be approximated numerically. In this article we introduce the main family of algorithms, known collectively as Markov Chain Monte Carlo (MCMC), that allow us to approximate the posterior distribution as calculated by Bayes' Theorem. In particular, we consider the Metropolis Algorithm, which is easily stated and relatively straightforward to understand. It serves as a useful starting point when learning about MCMC before delving into more sophisticated algorithms such as Metropolis-Hastings, Gibbs sampling and Hamiltonian Monte Carlo. Once we have described how MCMC works, we will carry it out using the open-source PyMC3 library, which takes care of many of the underlying implementation details, allowing us to concentrate on Bayesian modelling.
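
As a taste of what the article builds up to, here is a minimal hand-rolled sketch of random-walk Metropolis for the binomial-proportion problem; the coin-flip counts, step size, and flat Beta(1, 1) prior are illustrative assumptions, not values from the article:

```python
import numpy as np

# Observed data: 61 heads out of 100 flips (illustrative numbers)
heads, flips = 61, 100

def log_posterior(theta):
    """Log posterior of theta under a flat Beta(1, 1) prior."""
    if theta <= 0.0 or theta >= 1.0:
        return -np.inf  # outside the support of theta
    # Binomial log-likelihood; the flat prior adds only a constant
    return heads * np.log(theta) + (flips - heads) * np.log(1.0 - theta)

rng = np.random.default_rng(42)
theta = 0.5        # starting value of the chain
step_size = 0.05   # standard deviation of the random-walk proposal
samples = []

for _ in range(20000):
    proposal = theta + rng.normal(0.0, step_size)
    # Accept with probability min(1, posterior ratio); the proposal is
    # symmetric, so no Hastings correction term is needed.
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(theta):
        theta = proposal
    samples.append(theta)

burned = np.array(samples[5000:])  # discard burn-in
print(f"posterior mean of theta ~ {burned.mean():.3f}")
```

The accepted draws, after burn-in, approximate the posterior over the proportion; PyMC3 automates the same loop (plus tuning and diagnostics) behind `pm.sample`.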


Linear, Machine Learning and Probabilistic Approaches for Time Series Analysis

@machinelearnbot

In this post, we consider different approaches to time series modeling. Forecasting approaches using linear models, the ARIMA algorithm, and the XGBoost machine learning algorithm are described, and results of different model combinations are shown. For probabilistic modeling, approaches using copulas and Bayesian inference are considered. Time series analysis, especially forecasting, is an important problem in modern predictive analytics.
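
For the forecasting side, a minimal sketch of an ARIMA fit with statsmodels might look like the following; the toy series and the (1, 1, 1) order are illustrative assumptions, not the post's actual data or model selection:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Toy series standing in for the post's data: drifting random walk
rng = np.random.default_rng(1)
y = np.cumsum(rng.normal(0.5, 1.0, size=200))

# Fit an ARIMA(p=1, d=1, q=1) model; the order is an illustrative
# choice, in practice selected via diagnostics or information criteria
model = ARIMA(y, order=(1, 1, 1))
fitted = model.fit()

# Forecast the next 10 steps ahead
forecast = fitted.forecast(steps=10)
print(forecast)
```

A model combination of the kind the post describes would then blend such ARIMA forecasts with, for example, XGBoost predictions, while the copula and Bayesian approaches attach full predictive distributions rather than point forecasts.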