2008-07-09-welling08a.md

File metadata and controls

44 lines (44 loc) · 1.79 KB
---
title: Hybrid variational/gibbs collapsed inference in topic models
abstract: Variational Bayesian inference and (collapsed) Gibbs sampling are the two
  important classes of inference algorithms for Bayesian networks. Both have their
  advantages and disadvantages: collapsed Gibbs sampling is unbiased but is also
  inefficient for large count values and requires averaging over many samples to
  reduce variance. On the other hand, variational Bayesian inference is efficient
  and accurate for large count values but suffers from bias for small counts. We
  propose a hybrid algorithm that combines the best of both worlds: it samples very
  small counts and applies variational updates to large counts. This hybridization
  is shown to significantly improve test-set perplexity relative to variational
  inference at no computational cost.
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: welling08a
month: 0
tex_title: Hybrid variational/gibbs collapsed inference in topic models
firstpage: 587
lastpage: 594
page: 587-594
order: 587
cycles: false
bibtex_editor: McAllester, David A. and Myllym{\"a}ki, Petri
editor:
- given: David A.
  family: McAllester
- given: Petri
  family: Myllymäki
bibtex_author: Welling, Max and Teh, Yee Whye and Kappen, Bert
author:
- given: Max
  family: Welling
- given: Yee Whye
  family: Teh
- given: Bert
  family: Kappen
date: 2008-07-09
note: Reissued by PMLR on 30 October 2024.
address:
container-title: Proceedings of the 24th Conference on Uncertainty in Artificial
  Intelligence
volume: R6
genre: inproceedings
issued:
  date-parts:
  - 2008
  - 7
  - 9
pdf:
extras:
---