Comparison of MC dropout and Bayesian Neural Networks

Speaker: Milad Zeraatpisheh (Faculty of Science, Technology and Medicine; University of Luxembourg)
Title: Comparison of MC dropout and Bayesian Neural Networks
Time: Wednesday, 2021.01.20, 10:00 a.m. (CET)
Place: fully virtual (contact Dr. Jakub Lengiewicz to register)
Format: 30 min. presentation + 30 min. discussion

Abstract: 

This presentation explores the comparative performance, theoretical underpinnings, and practical implications of Monte Carlo (MC) dropout and Bayesian Neural Networks (BNNs) as methods for uncertainty estimation in deep learning. MC dropout, a lightweight and widely adopted technique, approximates Bayesian inference by keeping dropout active at both training and test time and averaging over repeated stochastic forward passes, which offers computational simplicity. In contrast, BNNs place prior distributions over the network weights and perform posterior inference, providing a more principled but computationally intensive approach. The talk highlights key differences between the two approaches.
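
To make the MC dropout procedure described above concrete, the following minimal sketch (not part of the talk) shows how predictive mean and uncertainty can be obtained by running multiple stochastic forward passes with dropout left on at inference time. The PyTorch model, dropout rate, and number of samples are illustrative assumptions, not the speaker's implementation.

    import torch
    import torch.nn as nn

    # Illustrative regression network containing a dropout layer.
    model = nn.Sequential(
        nn.Linear(1, 64),
        nn.ReLU(),
        nn.Dropout(p=0.1),
        nn.Linear(64, 1),
    )

    def mc_dropout_predict(model, x, n_samples=100):
        """Repeated stochastic forward passes with dropout active;
        returns the predictive mean and standard deviation."""
        model.train()  # keeps dropout layers active at inference time
        with torch.no_grad():
            samples = torch.stack([model(x) for _ in range(n_samples)])
        return samples.mean(dim=0), samples.std(dim=0)

    # Example usage on a toy input grid.
    x = torch.linspace(-1.0, 1.0, 20).unsqueeze(-1)
    mean, std = mc_dropout_predict(model, x)

The standard deviation across the sampled passes serves as the uncertainty estimate; a full BNN would instead require maintaining and sampling from a posterior over the weights themselves.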