Dropout Tutorial in PyTorch

Tutorial: Dropout as Regularization and Bayesian Approximation

This tutorial aims to give readers a complete view of dropout: its implementation in PyTorch, how to use it, and why it is useful. In short, dropout can (1) reduce overfitting, so test results improve, and (2) provide model uncertainty, like the Bayesian models we saw in class (Bayesian approximation).
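The two uses above can be sketched without any framework. Below is a minimal NumPy illustration of inverted dropout (the variant PyTorch's `nn.Dropout` implements: survivors are scaled by 1/(1-p) during training) and of Monte Carlo dropout for uncertainty. The network, weights, and function names (`forward`, `W1`, `W2`) are hypothetical, chosen only for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p=0.5, train=True):
    """Inverted dropout: zero units with probability p,
    scale survivors by 1/(1-p) so the expected value is unchanged."""
    if not train or p == 0.0:
        return x
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

# A tiny hypothetical network: ReLU hidden layer with dropout.
W1 = rng.normal(size=(3, 8))
W2 = rng.normal(size=(8, 1))

def forward(x, train):
    h = np.maximum(x @ W1, 0.0)          # ReLU hidden activations
    h = dropout(h, p=0.5, train=train)   # stochastic only when training
    return h @ W2

x = rng.normal(size=(1, 3))

# (1) Regularization: dropout is active during training only;
# at test time the forward pass is deterministic.
test_out = forward(x, train=False)

# (2) Bayesian approximation (MC dropout): keep dropout ON at test
# time, run many stochastic passes, and read the spread as uncertainty.
samples = np.stack([forward(x, train=True) for _ in range(200)])
mean, std = samples.mean(axis=0), samples.std(axis=0)
```

In PyTorch the same switch is `model.train()` versus `model.eval()`; MC dropout amounts to leaving the dropout layers in training mode while predicting.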

Please view our tutorial here.

References

[1] Improving neural networks by preventing co-adaptation of feature detectors, G. E. Hinton, et al., 2012
[2] Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning, Y. Gal, and Z. Ghahramani, 2016
[3] Dropout: A Simple Way to Prevent Neural Networks from Overfitting, N. Srivastava, et al., 2014
