Minimum Volume Topic Modeling

Byoungwook Jang, Alfred Hero

arXiv.org Machine Learning 

We propose a new topic modeling procedure that takes advantage of the fact that the Latent Dirichlet Allocation (LDA) log likelihood function is asymptotically equivalent to the logarithm of the volume of the topic simplex. This allows topic modeling to be reformulated as finding the probability simplex that minimizes its volume and encloses the documents that are represented as distributions over words. A convex relaxation of the minimum volume topic model optimization is proposed, and it is shown that the relaxed problem has the same global …

There are many extensions of LDA, including a nonparametric extension based on the Dirichlet process called the Hierarchical Dirichlet Process (Teh et al., 2005), a correlated topic extension based on the logistic normal prior on the topic proportions (Lafferty and Blei, 2006), and a time-varying topic modeling extension (Blei and Lafferty, 2006). There are two main approaches for estimating the parameters of probabilistic topic models: the variational approximation popularized by Blei et al. (2003) and the sampling-based approach studied by Pritchard et al. (2000).
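The geometric picture in the abstract can be illustrated with a toy sketch (not the authors' algorithm): K topic distributions over V words span a (K-1)-dimensional simplex whose volume is proportional to sqrt(det(DᵀD)), where D stacks the edge vectors beta_k - beta_1, and a document lies inside the model when its word distribution is a convex combination of the topics. All names and dimensions below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
K, V = 3, 5                                   # toy sizes: 3 topics over 5 words
topics = rng.dirichlet(np.ones(V), size=K)    # each row is a distribution over words

# Edge matrix D (V x (K-1)): differences of topic vectors from a base vertex.
D = (topics[1:] - topics[0]).T
# Log of the Gram determinant; the simplex volume is proportional to its sqrt,
# so minimizing this term shrinks the topic simplex.
log_gram = np.linalg.slogdet(D.T @ D)[1]

# A document generated inside the simplex: convex combination of the topics.
w = np.array([0.2, 0.5, 0.3])
doc = w @ topics

# Recover the mixing weights by least squares to check enclosure on this toy case.
w_hat, *_ = np.linalg.lstsq(topics.T, doc, rcond=None)
```

Here `w_hat` recovers the true weights `w`, confirming that `doc` sits inside the simplex spanned by `topics`; the minimum-volume idea is to seek the smallest such simplex still enclosing every document's empirical word distribution.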
