Calibrating Deep Neural Networks using Focal Loss

We propose focal loss as an alternative to cross-entropy loss for training neural networks that are well calibrated, confident, and accurate.

On using Focal Loss for Neural Network Calibration

We propose focal loss as an alternative to cross-entropy loss for training neural networks that are well calibrated, confident, and accurate.
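To make the focal-loss idea behind these two papers concrete, here is a minimal sketch of the loss for the probability assigned to the true class. The function name and the example values are illustrative, not from the papers:

```python
import math

def focal_loss(p_t, gamma=2.0):
    """Focal loss for the probability p_t assigned to the true class:

        FL(p_t) = -(1 - p_t)^gamma * log(p_t)

    With gamma = 0 this reduces to the standard cross-entropy loss;
    larger gamma down-weights already-confident (easy) examples, which
    is the mechanism linked to better calibration.
    """
    return -((1.0 - p_t) ** gamma) * math.log(p_t)

p = 0.9                           # a confident, correct prediction
ce = focal_loss(p, gamma=0.0)     # plain cross-entropy
fl = focal_loss(p, gamma=2.0)     # focal loss shrinks this easy example's loss
```

For confident predictions the `(1 - p_t)^gamma` factor is small, so well-classified examples contribute little gradient and the network is not pushed toward over-confidence.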

Evaluating Bayesian Deep Learning Methods for Semantic Segmentation

We propose new metrics for evaluating uncertainty estimates in semantic segmentation, use them to assess Bayesian Deep Learning methods, and thereby establish new benchmarks.
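A sketch of the kind of conditional metric this line of work uses: given per-pixel correctness and per-pixel uncertainty masks, measure how well uncertainty aligns with error. The metric names and the flat per-pixel formulation here are assumptions for illustration, not the paper's exact definitions:

```python
import numpy as np

def conditional_uncertainty_metrics(correct, uncertain):
    """Conditional uncertainty metrics over boolean per-pixel masks
    (illustrative formulation, not the paper's exact windowed version):

      p(accurate | certain):    of the pixels the model is certain about,
                                what fraction are correct?
      p(uncertain | inaccurate): of the pixels the model gets wrong,
                                 what fraction is it uncertain about?

    Both should be high for a model with useful uncertainty estimates.
    """
    certain = ~uncertain
    inaccurate = ~correct
    p_acc_given_cert = (correct & certain).sum() / max(certain.sum(), 1)
    p_unc_given_inacc = (uncertain & inaccurate).sum() / max(inaccurate.sum(), 1)
    return p_acc_given_cert, p_unc_given_inacc

# A model that is certain exactly where it is right, and uncertain
# exactly where it is wrong, scores perfectly on both metrics:
correct = np.array([True, True, False, False])
uncertain = np.array([False, False, True, True])
conditional_uncertainty_metrics(correct, uncertain)  # → (1.0, 1.0)
```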

On the Importance of Strong Baselines in Bayesian Deep Learning

We re-evaluate MC Dropout, performing a grid search over dropout rates to produce significantly stronger baselines, and compare it with other state-of-the-art variational inference approaches.
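The MC Dropout baseline can be sketched in a few lines: keep dropout active at test time and average several stochastic forward passes to get a predictive mean and an uncertainty estimate. The tiny network and weights below are illustrative assumptions; the dropout rate `p_drop` is the hyperparameter a grid search would tune:

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(x, W1, W2, p_drop, rng):
    """One stochastic pass of a toy MLP with dropout kept active at
    test time (the MC Dropout trick). Architecture is illustrative."""
    h = np.maximum(x @ W1, 0.0)               # ReLU hidden layer
    mask = rng.random(h.shape) >= p_drop      # Bernoulli dropout mask
    h = h * mask / (1.0 - p_drop)             # inverted-dropout scaling
    return h @ W2

W1 = rng.normal(size=(3, 16))
W2 = rng.normal(size=(16, 1))
x = rng.normal(size=(1, 3))

# MC Dropout predictive distribution: average T stochastic passes.
# A grid search over p_drop (rather than a fixed default) is what
# yields the stronger baselines.
T, p_drop = 100, 0.2
samples = np.stack([forward(x, W1, W2, p_drop, rng) for _ in range(T)])
mean = samples.mean(axis=0)   # predictive mean
std = samples.std(axis=0)     # predictive uncertainty
```

Because the dropout masks differ across passes, the spread of `samples` serves as an approximate posterior predictive uncertainty.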