CUED Publications database

Dropout as a Bayesian Approximation: Appendix

Gal, Y. and Ghahramani, Z. (2016) Dropout as a Bayesian Approximation: Appendix. In: 33rd International Conference on Machine Learning (ICML 2016), vol. 3, pp. 1661-1680.

Full text not available from this repository.

Abstract

We show that a neural network with arbitrary depth and non-linearities, with dropout applied before every weight layer, is mathematically equivalent to an approximation to a well-known Bayesian model. This interpretation may offer an explanation for some of dropout's key properties, such as its robustness to overfitting. Our interpretation allows us to reason about uncertainty in deep learning, and allows the introduction of the Bayesian machinery into existing deep learning frameworks in a principled way. This document is an appendix to the main paper "Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning" by Gal and Ghahramani, 2015.
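As a concrete illustration of how this interpretation lets one reason about uncertainty, the sketch below shows Monte Carlo dropout as proposed in the main paper: dropout is kept active at prediction time and several stochastic forward passes are averaged, with their spread serving as an uncertainty estimate. The architecture, dropout rate, and number of samples T are illustrative assumptions only, not the paper's experimental setup, and the observation-noise term from the paper's predictive variance is omitted for brevity.

```python
# Minimal sketch of Monte Carlo dropout for predictive uncertainty.
# Assumptions: PyTorch, a small fully connected network, and arbitrary
# hyperparameters chosen only for illustration.
import torch
import torch.nn as nn


def make_net(d_in: int, d_hidden: int, d_out: int, p: float = 0.5) -> nn.Module:
    # Dropout applied before every weight layer, matching the setting
    # described in the abstract.
    return nn.Sequential(
        nn.Dropout(p), nn.Linear(d_in, d_hidden), nn.ReLU(),
        nn.Dropout(p), nn.Linear(d_hidden, d_out),
    )


@torch.no_grad()
def mc_dropout_predict(model: nn.Module, x: torch.Tensor, T: int = 100):
    """Run T stochastic forward passes with dropout active and return the
    predictive mean and variance across samples (a simple uncertainty proxy)."""
    model.train()  # keep dropout layers stochastic at test time
    samples = torch.stack([model(x) for _ in range(T)])  # shape (T, batch, d_out)
    return samples.mean(dim=0), samples.var(dim=0)


if __name__ == "__main__":
    net = make_net(d_in=4, d_hidden=32, d_out=1)
    x = torch.randn(8, 4)
    mean, var = mc_dropout_predict(net, x)
    print(mean.shape, var.shape)  # (8, 1) predictive mean and variance
```

In practice the network would first be trained with dropout as usual; the only change at prediction time is sampling multiple dropout masks instead of rescaling the weights once.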

Item Type: Article
Subjects: UNSPECIFIED
Divisions: Div F > Computational and Biological Learning
Depositing User: Cron Job
Date Deposited: 17 Jul 2017 19:23
Last Modified: 16 Nov 2017 02:18
DOI: