Scalable MCMC for Mixed Membership Stochastic Blockmodels
| Authors | |
|---|---|
| Publication date | 2016 |
| Journal | JMLR Workshop and Conference Proceedings |
| Event | Conference on Artificial Intelligence and Statistics (AISTATS2016) |
| Volume | 51 |
| Issue number | |
| Pages (from-to) | 723-731 |
| Organisations | |
| Abstract | We propose a stochastic gradient Markov chain Monte Carlo (SG-MCMC) algorithm for scalable inference in mixed-membership stochastic blockmodels (MMSB). Our algorithm is based on the stochastic gradient Riemannian Langevin sampler and achieves both faster speed and higher accuracy at every iteration than the current state-of-the-art algorithm based on stochastic variational inference. In addition, we develop an approximation that can handle models with a very large number of communities. The experimental results show that SG-MCMC strictly dominates competing algorithms in all cases. |
| Document type | Article |
| Note | Artificial Intelligence and Statistics, 9-11 May 2016, Cadiz, Spain |
| Language | English |
| Published at | http://jmlr.org/proceedings/papers/v51/li16d.html |
| Downloads | li16d (Final published version) |
| Permalink to this page | |
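The abstract builds on the stochastic gradient Riemannian Langevin sampler. As background, the following is a minimal illustrative sketch of one expanded-mean SGRLD step in the style of Patterson and Teh's sampler for simplex-constrained parameters (such as MMSB community-membership vectors); it is not the paper's exact update, and the function names and the toy gradient are hypothetical.

```python
import numpy as np

def sgrld_step(theta, stoch_grad_logp, eps, rng):
    """One stochastic gradient Riemannian Langevin step on a positive
    parameter vector theta (expanded-mean parameterization: the simplex
    point is theta / theta.sum()).  Illustrative sketch only.

    stoch_grad_logp(theta) is a stochastic (minibatch) estimate of the
    gradient of the log posterior with respect to theta.
    """
    noise = rng.normal(size=theta.shape)
    # Riemannian metric G(theta)^{-1} = diag(theta): the drift is
    # preconditioned by theta, plus the +1 curvature-correction term,
    # and the injected noise is scaled by sqrt(theta).
    theta_new = (theta
                 + 0.5 * eps * (theta * stoch_grad_logp(theta) + 1.0)
                 + np.sqrt(eps * theta) * noise)
    # Mirror at zero to keep every coordinate positive.
    return np.abs(theta_new)

# Toy usage: sample from a Gamma(2, 1) posterior per coordinate
# (log-density gradient (a - 1)/theta - 1 with a = 2).
rng = np.random.default_rng(0)
theta = np.ones(4)
grad = lambda t: (2.0 - 1.0) / t - 1.0
for _ in range(1000):
    theta = sgrld_step(theta, grad, 0.01, rng)
```

The mirroring trick (taking the absolute value) is a cheap way to respect the positivity constraint; after normalization, `theta / theta.sum()` yields a point on the simplex.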