Practical Deep Learning with Bayesian Principles

Kazuki Osawa, Siddharth Swaroop, Mohammad Emtiyaz Khan, Anirudh Jain, Runa Eschenhagen, Richard E. Turner, Rio Yokota

Neural Information Processing Systems

[Abstract garbled in extraction; the surviving fragments reference Figure 2, a momentum correspondence with Adam, calibration results on ImageNet, and the evaluation protocol used for comparison with prior work.]







Multi-level Monte Carlo Dropout for Efficient Uncertainty Quantification

Aaron Pim, Tristan Pryer

arXiv.org Machine Learning

We develop a multilevel Monte Carlo (MLMC) framework for uncertainty quantification with Monte Carlo dropout. Treating dropout masks as a source of epistemic randomness, we define a fidelity hierarchy by the number of stochastic forward passes used to estimate predictive moments. We construct coupled coarse–fine estimators by reusing dropout masks across fidelities, yielding telescoping MLMC estimators for both predictive means and predictive variances that remain unbiased for the corresponding dropout-induced quantities while reducing sampling variance at fixed evaluation budget. We derive explicit bias, variance, and effective cost expressions, together with sample-allocation rules across levels. Numerical experiments on forward and inverse PINNs–Uzawa benchmarks confirm the predicted variance rates and demonstrate efficiency gains over single-level MC-dropout at matched cost.
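The construction in the abstract — a fidelity hierarchy over the number of stochastic forward passes, coarse–fine coupling by dropout-mask reuse, and a telescoping sum over levels — can be sketched in a few lines of NumPy. Everything here is an illustrative stand-in, not the paper's implementation: the "network" is a toy linear map with ReLU and dropout, the level sizes `M_l = m0 * 2**l` and the per-level sample counts are arbitrary choices, and only the predictive mean is estimated.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a trained network: fixed weights, dropout on the
# hidden layer. The MLMC construction only needs repeated stochastic
# forward passes, so any MC-dropout model could be slotted in here.
W = rng.normal(size=(16, 16))
v = rng.normal(size=16)
P_DROP = 0.2  # dropout probability (illustrative choice)

def forward(x, mask):
    """One stochastic forward pass under a fixed dropout mask."""
    h = np.maximum(W @ x, 0.0) * mask / (1.0 - P_DROP)
    return v @ h

def level_mean(x, masks):
    """Level estimator: predictive mean from a given set of masks."""
    return np.mean([forward(x, m) for m in masks])

def mlmc_mean(x, num_levels=4, m0=4, samples_per_level=None):
    """Telescoping MLMC estimate of the dropout predictive mean.

    Level l uses M_l = m0 * 2**l forward passes. The coarse estimator
    paired with a fine one at level l reuses the first M_{l-1} of the
    fine estimator's masks; this coupling shrinks the variance of the
    level difference, which is what makes the telescope pay off.
    """
    if samples_per_level is None:
        # Crude geometric allocation: more samples on cheap levels.
        samples_per_level = [2 ** (num_levels - l) for l in range(num_levels)]
    total = 0.0
    for l, n_l in enumerate(samples_per_level):
        diffs = []
        for _ in range(n_l):
            m_fine = m0 * 2 ** l
            masks = rng.binomial(1, 1.0 - P_DROP, size=(m_fine, 16))
            fine = level_mean(x, masks)
            if l == 0:
                diffs.append(fine)
            else:
                # Coarse estimator reuses the first half of the masks.
                coarse = level_mean(x, masks[: m_fine // 2])
                diffs.append(fine - coarse)
        total += np.mean(diffs)
    return total
```

Because every level's mean estimator has the same expectation under the dropout distribution, the telescoping sum is unbiased for the dropout-induced predictive mean; the variance estimator in the paper requires the additional bias analysis the abstract mentions, which this sketch does not cover.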