
Language: Python. License: GNU General Public License v2.0 (GPL-2.0).

LDA

A Python implementation of LDA (Latent Dirichlet Allocation) using Gibbs sampling; includes the learning algorithm and a test dataset.
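As a rough illustration of how such a sampler works, here is a minimal sketch of collapsed Gibbs sampling for LDA in the style of Griffiths and Steyvers (2004). It is not the repository's code: the function name `gibbs_lda` and all of its arguments are hypothetical, and the priors are symmetric scalars.

```python
# A minimal sketch of collapsed Gibbs sampling for LDA (illustrative only).
import numpy as np

def gibbs_lda(docs, n_topics, vocab_size, n_iters=200, alpha=1.0, beta=1.0):
    """Collapsed Gibbs sampler for LDA.

    docs: list of documents, each a list of word ids in [0, vocab_size).
    Returns the doc-topic and topic-word count matrices.
    """
    rng = np.random.default_rng(0)
    n_docs = len(docs)
    # Count matrices maintained incrementally by the sampler.
    ndk = np.zeros((n_docs, n_topics))      # doc-topic counts
    nkw = np.zeros((n_topics, vocab_size))  # topic-word counts
    nk = np.zeros(n_topics)                 # tokens per topic
    z = []                                  # topic assignment per token

    # Random initialization of topic assignments.
    for d, doc in enumerate(docs):
        zd = rng.integers(n_topics, size=len(doc))
        z.append(zd)
        for w, k in zip(doc, zd):
            ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1

    for _ in range(n_iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]
                # Remove the current token from the counts.
                ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1
                # Full conditional p(z_i = k | z_-i, w), up to a constant.
                p = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + vocab_size * beta)
                k = rng.choice(n_topics, p=p / p.sum())
                # Add the token back under its newly sampled topic.
                z[d][i] = k
                ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    return ndk, nkw
```

For example, `gibbs_lda([[0, 1, 2], [2, 3, 3]], n_topics=2, vocab_size=4)` runs the sampler on two tiny documents. Maintaining the three count arrays incrementally is what makes each per-token update O(K) instead of requiring a full recount of the corpus.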

Todo:

Support Dirichlet hyperparameters alpha, beta != 1 (the current sampler fixes alpha = beta = 1); the generalized update is sketched below.
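For reference, generalizing the collapsed Gibbs update to arbitrary (even asymmetric) hyperparameters only changes the smoothing terms of the full conditional. Following Heinrich's derivation, with n_{d,k}, n_{k,w}, and n_k denoting the doc-topic, topic-word, and topic-total counts (superscript ¬i meaning token i is excluded), the update becomes:

```latex
% Full conditional for token i with general Dirichlet hyperparameters
% \alpha = (\alpha_1, \dots, \alpha_K) and \beta = (\beta_1, \dots, \beta_V):
p(z_i = k \mid \mathbf{z}_{\neg i}, \mathbf{w})
  \;\propto\;
  \left( n_{d,k}^{\neg i} + \alpha_k \right)
  \cdot
  \frac{n_{k,w_i}^{\neg i} + \beta_{w_i}}
       {n_{k}^{\neg i} + \sum_{v=1}^{V} \beta_v}
```

With alpha_k = beta_v = 1 for all k and v, this reduces to the update the implementation currently uses.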

References

Y. Wang. Distributed Gibbs Sampling of Latent Topic Models: The Gritty Details. August 2008. [Implementation detail]

G. Heinrich. Parameter estimation for text analysis. Technical Report, Fraunhofer IGD, Darmstadt, Germany. Version 2.9, 15 September 2009. [Tutorial with a detailed mathematical derivation]

P. Resnik and E. Hardisty. Gibbs Sampling for the Uninitiated. Technical Report CS-TR-4956, UMIACS-TR-2010-04, LAMP-TR-153. June 2010. [Tutorial on Gibbs sampling]

T. L. Griffiths and M. Steyvers. Finding scientific topics. Proceedings of the National Academy of Sciences of the United States of America, 101(suppl. 1):5228–5235, 2004. [Dataset]

D. Blei, A. Ng, and M. Jordan. Latent Dirichlet allocation. Journal of Machine Learning Research, 3:993–1022, January 2003. [LDA paper]

D. Blei. Course notes for COS597C at Princeton (Advanced Methods in Probabilistic Modeling): Mixed-membership models. October 2013. Link: http://www.cs.princeton.edu/courses/archive/fall11/cos597C/lectures/mixed-membership.pdf [Background; other notes from the course, e.g. on nonparametric Bayesian methods and variational inference, may also be of interest]

K. P. Murphy. Machine Learning: a Probabilistic Perspective. MIT Press, 2012.