Variational Dropout and the Local Reparameterization Trick
| Authors | D. P. Kingma, T. Salimans, M. Welling |
|---|---|
| Publication date | 2015 |
| Host editors | |
| Book title | 29th Annual Conference on Neural Information Processing Systems 2015 |
| Book subtitle | Montreal, Canada, 7-12 December 2015 |
| ISBN | |
| Series | Advances in Neural Information Processing Systems |
| Event | Neural Information Processing Systems (NIPS2015) |
| Volume | |
| Issue number | 3 |
| Pages (from-to) | 2575-2583 |
| Publisher | Red Hook, NY: Curran Associates |
| Organisations | |
| Abstract | We investigate a local reparameterization technique for greatly reducing the variance of stochastic gradients for variational Bayesian inference (SGVB) of a posterior over model parameters, while retaining parallelizability. This local reparameterization translates uncertainty about global parameters into local noise that is independent across datapoints in the minibatch. Such parameterizations can be trivially parallelized and have variance that is inversely proportional to the minibatch size, generally leading to much faster convergence. Additionally, we explore a connection with dropout: Gaussian dropout objectives correspond to SGVB with local reparameterization, a scale-invariant prior and proportionally fixed posterior variance. Our method allows inference of more flexibly parameterized posteriors; specifically, we propose variational dropout, a generalization of Gaussian dropout where the dropout rates are learned, often leading to better models. The method is demonstrated through several experiments. (A minimal code sketch of the trick follows the table below.) |
| Document type | Conference contribution |
| Language | English |
| Published at | http://papers.nips.cc/paper/5666-variational-dropout-and-the-local-reparameterization-trick |
| Downloads | 5666-variational-dropout-and-the-local-reparameterization-trick (Accepted author manuscript) |
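The abstract describes the local reparameterization trick: rather than sampling a weight matrix W ~ q(W) and computing pre-activations B = AW, one samples B directly from its implied Gaussian marginal, drawing independent noise per datapoint. Below is a minimal NumPy sketch of that idea under the variational-dropout posterior (posterior variance tied to the mean, σ²_ij = α·θ²_ij). The function name, variable names, and toy shapes are illustrative assumptions, not code from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_reparam_layer(A, theta, log_sigma2):
    """Sample layer outputs B ~ q(B | A) directly, instead of sampling
    a weight matrix W ~ q(W) and computing B = A @ W.

    For a factorized Gaussian posterior q(w_ij) = N(theta_ij, sigma2_ij),
    each pre-activation b_mj is also Gaussian:
        mean     gamma_mj = sum_i a_mi   * theta_ij
        variance delta_mj = sum_i a_mi^2 * sigma2_ij
    Drawing one epsilon per output element (m, j) makes the noise
    independent across datapoints in the minibatch.
    """
    sigma2 = np.exp(log_sigma2)
    gamma = A @ theta                   # (M, K) output means
    delta = (A ** 2) @ sigma2           # (M, K) output variances
    eps = rng.standard_normal(gamma.shape)
    return gamma + np.sqrt(delta) * eps

# Toy usage: minibatch of M=4 inputs through a 5 -> 3 layer.
M, D, K = 4, 5, 3
A = rng.standard_normal((M, D))
theta = rng.standard_normal((D, K)) * 0.1

# Variational-dropout posterior: sigma2_ij = alpha * theta_ij^2,
# where alpha is a (learnable) parameter; the corresponding
# dropout rate is p = alpha / (1 + alpha).
alpha = 0.25
log_sigma2 = np.log(alpha * theta ** 2 + 1e-12)

B = local_reparam_layer(A, theta, log_sigma2)
print(B.shape)  # (4, 3)
```

Because one epsilon is drawn per output element rather than one weight sample shared by the whole minibatch, the gradient noise is independent across datapoints and its variance shrinks inversely with minibatch size, which is the variance reduction the abstract claims.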