Stochastic Collapsed Variational Inference for Hidden Markov Models
Hidden Markov models (HMMs) [1] are popular probabilistic models for sequential data in a variety of fields including natural language processing, speech recognition, weather forecasting, financial prediction and bioinformatics. However, their traditional inference methods, such as variational inference (VI) [2] and Markov chain Monte Carlo (MCMC) [3], do not readily scale to large datasets; for example, one dataset in our experiments consists of 100 million observations. An important milestone in scaling VI was the stochastic VI (SVI) algorithm of Hoffman et al. [4], which computes cheap gradients from minibatches of data and updates the model parameters before a complete pass over the full dataset. A more recent scalable and more accurate algorithm was proposed by Foulds et al. [5], who applied such stochastic optimization to collapsed latent Dirichlet allocation (LDA) [6]; their stochastic collapsed variational inference (SCVI) algorithm has been successful in large-scale topic modelling.
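The SVI-style update described above can be sketched as follows. This is a minimal illustration, not the paper's algorithm: a global statistic `lam` is repeatedly moved toward a noisy minibatch-based estimate, rescaled by `N / batch_size` so its expectation matches the full-data quantity, with a decaying step size `rho_t = (t + tau)^(-kappa)` satisfying the Robbins-Monro conditions. The function and parameter names are hypothetical.

```python
import numpy as np

def svi_updates(data, n_steps=50, batch_size=10, tau=1.0, kappa=0.6, seed=0):
    """Toy SVI-style stochastic update of a single global statistic.

    Hypothetical sketch: here the "global parameter" lam is simply an
    estimate of the full-data sum, updated from minibatches without ever
    touching the whole dataset in one step.
    """
    rng = np.random.default_rng(seed)
    N = len(data)
    lam = 0.0  # global parameter estimate
    for t in range(n_steps):
        idx = rng.choice(N, size=batch_size, replace=False)
        # Noisy estimate of the full-data statistic from the minibatch,
        # rescaled by N / batch_size so its expectation is unbiased.
        lam_hat = (N / batch_size) * data[idx].sum()
        # Decaying step size (Robbins-Monro): sum rho_t diverges,
        # sum rho_t^2 converges, so the iterates can converge.
        rho = (t + tau) ** (-kappa)
        lam = (1.0 - rho) * lam + rho * lam_hat  # stochastic update
    return lam
```

With constant data the minibatch estimate is exact, so `lam` reaches the full-data sum immediately; with real data, the decaying step size averages out minibatch noise over iterations.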
Dec-5-2015