Scalable Inference for Logistic-Normal Topic Models
Neural Information Processing Systems
Logistic-normal topic models can effectively discover correlation structures among latent topics. However, their inference remains challenging because of the non-conjugacy between the logistic-normal prior and the multinomial topic mixing proportions. Existing algorithms either make restrictive mean-field assumptions or do not scale to large applications. This paper presents a partially collapsed Gibbs sampling algorithm that targets the provably correct posterior distribution by exploiting data augmentation. To improve time efficiency, we further present a parallel implementation that can handle large-scale applications and learn the correlation structures of thousands of topics from millions of documents. Extensive empirical results demonstrate the promise of our approach.
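To make the setting concrete, here is a minimal sketch (with hypothetical toy dimensions, not taken from the paper) of a logistic-normal topic model: document-topic proportions are the softmax of a Gaussian draw, and topic assignments are resampled in a Gibbs-style sweep given those proportions. The paper's actual contribution, the data-augmentation update for the Gaussian variables themselves and its parallelization, is omitted here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dimensions (assumptions, not from the paper).
K, V, N = 3, 8, 20      # topics, vocabulary size, words in one document

# Topic-word distributions Phi, assumed known for this sketch.
Phi = rng.dirichlet(np.ones(V), size=K)          # shape (K, V)

# Logistic-normal proportions: eta ~ N(mu, Sigma), theta = softmax(eta).
mu, Sigma = np.zeros(K), np.eye(K)
eta = rng.multivariate_normal(mu, Sigma)
theta = np.exp(eta - eta.max())
theta /= theta.sum()

words = rng.integers(0, V, size=N)   # a toy document
z = rng.integers(0, K, size=N)       # initial topic assignments

# One Gibbs sweep over topic assignments:
# p(z_n = k | w_n, theta, Phi) is proportional to theta_k * Phi[k, w_n].
for n in range(N):
    p = theta * Phi[:, words[n]]
    p /= p.sum()
    z[n] = rng.choice(K, p=p)
```

In the full algorithm, resampling `eta` given the topic counts is the hard step (the non-conjugacy the abstract refers to); data augmentation makes that conditional tractable.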