topic simplex



Reviews: Geometric Dirichlet Means Algorithm for topic inference

Neural Information Processing Systems

I like this paper for two different reasons. First, after RecoverKL and the spectral algorithm, this paper brings a genuinely novel and useful perspective to the topic inference problem for LDA, apparently without making strong assumptions about the topics, such as separability via anchor words. Secondly, it seems to be extremely good in practice, matching the speed of RecoverKL with the accuracy of Gibbs sampling algorithms.

A. The algorithm: Aspects of this work were known before. For example, Blei pointed out the convex geometry in the original LDA paper, and the connection between LDA/NMF and K-Means was also known. However, the novel aspect of this paper is that it uses these connections to propose an inference algorithm for LDA based entirely on the geometry of the topic and word simplices. This is done by making an additional connection between the topic inference problem and that of Centroidal Voronoi Tessellations of a convex simplex.
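The geometric idea the review describes, estimating topic vertices by clustering documents inside the word simplex and then pushing the cluster means outward toward the simplex boundary, can be sketched as follows. This is an illustrative reconstruction, not the authors' reference implementation; in particular the extension factor `gamma` is a hand-set stand-in for the tuning the paper derives from the geometry.

```python
import numpy as np

def geometric_dirichlet_means(doc_word_freqs, n_topics, gamma=1.0, seed=0):
    """Toy sketch of a Geometric Dirichlet Means-style estimate.

    doc_word_freqs: (n_docs, vocab) array; rows are empirical word
        distributions of documents (points in the word simplex).
    gamma: hypothetical extension factor pushing cluster means away
        from the global center toward the simplex boundary.
    """
    rng = np.random.default_rng(seed)
    n_docs, vocab = doc_word_freqs.shape
    # Plain Lloyd-style k-means on the word simplex.
    centers = doc_word_freqs[rng.choice(n_docs, n_topics, replace=False)]
    for _ in range(50):
        dists = ((doc_word_freqs[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = dists.argmin(1)
        for k in range(n_topics):
            if (labels == k).any():
                centers[k] = doc_word_freqs[labels == k].mean(0)
    # Extend each cluster mean away from the global mean, then project
    # back onto the probability simplex by clipping and renormalizing.
    center = doc_word_freqs.mean(0)
    topics = center + gamma * (centers - center)
    topics = np.clip(topics, 1e-12, None)
    return topics / topics.sum(1, keepdims=True)
```

The returned rows are valid word distributions, so they can be compared directly against topics from RecoverKL or a Gibbs sampler.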


Conic Scan-and-Cover algorithms for nonparametric topic modeling

Mikhail Yurochkin, Aritra Guha, XuanLong Nguyen

Neural Information Processing Systems

We propose new algorithms for topic modeling when the number of topics is unknown. Our approach relies on an analysis of the concentration of mass and angular geometry of the topic simplex, a convex polytope constructed by taking the convex hull of vertices representing the latent topics. Our algorithms are shown in practice to have accuracy comparable to that of a Gibbs sampler in terms of topic estimation, even though the Gibbs sampler requires the number of topics to be given. Moreover, they are among the fastest of several state-of-the-art parametric techniques. Statistical consistency of our estimator is established under some conditions.
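The cone-sweep idea in the abstract — letting the number of topics fall out of an angular cover of the topic simplex rather than fixing it in advance — can be sketched as below. This is an illustrative simplification, not the paper's algorithm: the cosine threshold `omega` here is a hand-set stand-in for the geometrically derived cone width.

```python
import numpy as np

def conic_scan_cover(doc_word_freqs, omega=0.6, max_topics=50):
    """Toy sketch of a conic scan-and-cover pass.

    Repeatedly pick the document farthest from the center, treat it as
    a topic vertex, and discard every document whose direction from the
    center lies inside the cone of cosine width `omega` around it.
    The number of topics emerges from the cover instead of being given.
    """
    center = doc_word_freqs.mean(0)
    residual = doc_word_freqs - center
    active = np.ones(len(residual), dtype=bool)
    topics = []
    while active.any() and len(topics) < max_topics:
        norms = np.linalg.norm(residual, axis=1)
        norms[~active] = -1.0
        i = norms.argmax()
        v = residual[i] / max(np.linalg.norm(residual[i]), 1e-12)
        topics.append(doc_word_freqs[i])
        # Cosine similarity of every direction with the new vertex.
        dirs = residual / np.maximum(
            np.linalg.norm(residual, axis=1, keepdims=True), 1e-12)
        active &= dirs @ v < omega
    return np.array(topics)
```

On well-separated data the cones each swallow one cluster, so the pass terminates with one vertex per latent topic.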


Minimum Volume Topic Modeling

Jang, Byoungwook, Hero, Alfred

arXiv.org Machine Learning

We propose a new topic modeling procedure that takes advantage of the fact that the Latent Dirichlet Allocation (LDA) log likelihood function is asymptotically equivalent to the logarithm of the volume of the topic simplex. This allows topic modeling to be reformulated as finding the probability simplex that minimizes its volume and encloses the documents that are represented as distributions over words. A convex relaxation of the minimum volume topic model optimization is proposed, and it is shown that the relaxed problem has the same global …

There are many extensions of LDA, including a nonparametric extension based on the Dirichlet process called the Hierarchical Dirichlet Process (Teh et al., 2005), a correlated topic extension based on the logistic normal prior on the topic proportions (Lafferty and Blei, 2006), and a time-varying topic modeling extension (Blei and Lafferty, 2006). There are two main approaches for estimation of the parameters of probabilistic topic models: the variational approximation popularized by Blei et al. (2003) and the sampling based approach studied by Pritchard et al. (2000).
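The objective the abstract describes is the volume of the simplex spanned by the topic vectors. For K vertices, that volume has a closed form via the Gram determinant of the edge vectors, which a minimum-volume procedure would minimize subject to the simplex enclosing the empirical document distributions. A small illustration of the volume computation itself (not the paper's estimator):

```python
import math
import numpy as np

def simplex_log_volume(vertices):
    """Log volume of the (K-1)-simplex spanned by the K vertex rows.

    Uses vol = sqrt(det(E E^T)) / (K-1)!, where E stacks the edge
    vectors b_k - b_1. Works for vertices embedded in any dimension,
    e.g. K topic vectors living in a V-dimensional word simplex.
    """
    E = vertices[1:] - vertices[0]      # (K-1, V) edge matrix
    gram = E @ E.T
    _, logdet = np.linalg.slogdet(gram)
    k = len(vertices) - 1
    return 0.5 * logdet - math.lgamma(k + 1)
```

For the unit right triangle with vertices (0,0), (1,0), (0,1) this returns log(1/2), the log of its area.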

