## Summary

### Link
Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning

### Author/Institution

### What is this?
Shows that the use of dropout (and its variants) in NNs can be interpreted as a Bayesian approximation of a well-known probabilistic model: the Gaussian process (GP).
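A minimal numpy sketch of the test-time procedure this interpretation leads to (MC dropout): keep dropout active at prediction time and average T stochastic forward passes, whose spread approximates the model's predictive uncertainty. The tiny one-hidden-layer network and its weights below are hypothetical, purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny regression network (1 input -> 50 hidden -> 1 output).
W1 = rng.standard_normal((1, 50))
W2 = rng.standard_normal((50, 1))

def forward(x, p_drop=0.5):
    """One stochastic forward pass with dropout kept ON at test time."""
    h = np.maximum(x @ W1, 0.0)          # ReLU hidden layer
    mask = rng.random(h.shape) > p_drop  # Bernoulli dropout mask
    h = h * mask / (1.0 - p_drop)        # inverted-dropout rescaling
    return h @ W2

def mc_dropout_predict(x, T=200):
    """Monte Carlo estimate of the predictive mean and uncertainty:
    average T dropout-perturbed forward passes."""
    samples = np.stack([forward(x) for _ in range(T)])
    return samples.mean(axis=0), samples.std(axis=0)

x = np.array([[0.3]])
mean, std = mc_dropout_predict(x)  # std > 0: the model reports uncertainty
```

The only change from standard inference is that the dropout masks are resampled at prediction time instead of being disabled, which is what makes the forward passes samples from the approximate posterior.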
### Comparison with previous research: what are the novelties/strong points?

### Key points

### How did the authors prove the effectiveness of the proposal?

### Any discussions?

### What should I read next?