Large-Scale Distributed Bayesian Matrix Factorization using Stochastic Gradient MCMC
| Authors | |
|---|---|
| Publication date | 2015 |
| Book title | KDD'15: proceedings of the 21st ACM SIGKDD International Conference on Knowledge Discovery and Data Mining: August 10-13, 2015, Sydney, Australia |
| ISBN | |
| Event | Conference on Knowledge Discovery and Data Mining (KDD 2015) |
| Pages (from-to) | 9-18 |
| Publisher | New York, NY: Association for Computing Machinery |
| Organisations | |
| Abstract | Despite having various attractive qualities such as high prediction accuracy, the ability to quantify uncertainty, and avoidance of overfitting, Bayesian Matrix Factorization has not been widely adopted because of the prohibitive cost of inference. In this paper, we propose a scalable distributed Bayesian matrix factorization algorithm using stochastic gradient MCMC. Our algorithm, based on Distributed Stochastic Gradient Langevin Dynamics, not only matches the prediction accuracy of standard MCMC methods like Gibbs sampling, but is also as fast and simple as stochastic gradient descent. In our experiments, we show that our algorithm can achieve the same level of prediction accuracy as Gibbs sampling an order of magnitude faster. We also show that our method reduces the prediction error as fast as distributed stochastic gradient descent, achieving a 4.1% improvement in RMSE for the Netflix dataset and a 1.8% improvement for the Yahoo Music dataset. |
| Document type | Conference contribution |
| Language | English |
| Published at | https://doi.org/10.1145/2783258.2783373 |
| Downloads | kdd15_dbmf_v0.07_submitted_arXiv (Submitted manuscript) |
| Permalink to this page | |
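
The abstract above centers on a stochastic gradient Langevin dynamics (SGLD) update for Bayesian matrix factorization. Below is a minimal, hypothetical single-machine sketch of that update (a stochastic gradient of the log posterior plus injected Gaussian noise), assuming a Gaussian likelihood with precision `TAU` and zero-mean Gaussian priors on the latent factors. The distributed block-scheduling that makes the paper's DSGLD algorithm scalable is not shown, and every name and hyperparameter value here is an illustrative assumption, not the paper's implementation or settings.

```python
# Minimal, single-machine SGLD sketch for Bayesian matrix factorization.
# The paper's DSGLD additionally partitions the rating matrix into blocks
# processed in parallel on several workers; that logic is omitted here.
# All hyperparameters (K, TAU, LAM, batch size, step-size schedule) are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

N_USERS, N_ITEMS, K = 1_000, 1_500, 20   # toy problem sizes
TAU = 2.0                                # observation (likelihood) precision
LAM = 0.1                                # Gaussian prior precision on factors

U = 0.1 * rng.standard_normal((N_USERS, K))   # user latent factors
V = 0.1 * rng.standard_normal((N_ITEMS, K))   # item latent factors

# toy ratings as (user, item, rating) triples
N_RATINGS = 50_000
ratings = np.column_stack([
    rng.integers(0, N_USERS, size=N_RATINGS),
    rng.integers(0, N_ITEMS, size=N_RATINGS),
    rng.integers(1, 6, size=N_RATINGS),
])

def sgld_step(U, V, batch, step_size):
    """One SGLD update: half a step of stochastic gradient ascent on the
    log posterior plus Gaussian noise whose variance equals the step size."""
    users, items = batch[:, 0], batch[:, 1]
    r = batch[:, 2].astype(float)

    err = r - np.sum(U[users] * V[items], axis=1)   # residuals on the batch
    scale = N_RATINGS / len(batch)                  # rescale to the full data

    # stochastic gradient of the log posterior: prior term on all factors,
    # minibatch likelihood term scattered onto the rows touched by the batch
    grad_U = -LAM * U
    grad_V = -LAM * V
    np.add.at(grad_U, users, TAU * scale * err[:, None] * V[items])
    np.add.at(grad_V, items, TAU * scale * err[:, None] * U[users])

    # Langevin dynamics: gradient step plus injected Gaussian noise, so the
    # iterates approximate posterior samples rather than a point estimate
    U += 0.5 * step_size * grad_U + np.sqrt(step_size) * rng.standard_normal(U.shape)
    V += 0.5 * step_size * grad_V + np.sqrt(step_size) * rng.standard_normal(V.shape)

for t in range(200):
    idx = rng.choice(N_RATINGS, size=1_024, replace=False)
    sgld_step(U, V, ratings[idx], step_size=1e-4 * (t + 1) ** -0.51)
```

With a decaying step size, collecting the iterates of `U` and `V` and averaging their predictions yields an approximate posterior mean, which is how an SGLD-based sampler retains the uncertainty quantification mentioned in the abstract while keeping per-step cost comparable to stochastic gradient descent.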