dlmacedo changed the title from "[FEATURE] Add Distinction Maximization Loss for improved OOD detection performance on ImageNet" to "[FEATURE] Add Distinction Maximization Loss to notably improve the OOD detection performance on ImageNet" on Jul 29, 2022
dlmacedo changed the title from "[FEATURE] Add Distinction Maximization Loss to notably improve the OOD detection performance on ImageNet" to "[FEATURE] Add the Distinction Maximization Loss (DisMax) to notably improve the OOD detection performance on ImageNet" on Jul 29, 2022
We suggest adding the DisMax loss to improve OOD detection:
https://arxiv.org/abs/2205.05874
It improves the AUROC OOD detection performance of a ResNet18 trained on ImageNet by almost 25% when using the hard ImageNet-O as the out-of-distribution dataset.
All that is required is to replace the SoftMax loss (i.e., the combination of the linear output layer, the SoftMax activation, and the cross-entropy loss) with the DisMax loss (see the code below for how this is done).
The code is essentially ready; it is mainly a matter of integrating it from this repository:
https://github.com/dlmacedo/distinction-maximization-loss/
Using DisMax without FPR, which is the configuration that produced the result above, no hyperparameters need to be tuned.
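A minimal sketch of what the swap could look like, assuming the `DisMaxLossFirstPart` / `DisMaxLossSecondPart` naming convention used in the author's loss releases (the exact import path, class names, and constructor signatures should be checked against the linked repository; the backbone and training step here are placeholders):

```python
import torch
import torchvision

# Hypothetical import: in practice, copy the dismax module from
# https://github.com/dlmacedo/distinction-maximization-loss/
from losses import dismax

num_classes = 1000
model = torchvision.models.resnet18(num_classes=num_classes)

# Replace the linear output layer (the "first part" of the SoftMax loss)
# with the DisMax first part, which produces distance-based logits.
model.fc = dismax.DisMaxLossFirstPart(model.fc.in_features, num_classes)

# Replace the cross-entropy criterion (the "second part" of the SoftMax loss)
# with the DisMax second part. The exact constructor arguments may differ;
# see the repository.
criterion = dismax.DisMaxLossSecondPart(model.fc)

optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)

def train_step(images, targets):
    # Standard training loop: only the classification head and the
    # criterion changed relative to SoftMax-loss training.
    logits = model(images)
    loss = criterion(logits, targets)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

@torch.no_grad()
def ood_score(images):
    # At inference time, an OOD score can be derived from the DisMax
    # logits (e.g., the maximum logit); the repository documents the
    # score the paper actually reports.
    logits = model(images)
    return logits.max(dim=1).values
```

The key design point is that the OOD detector needs no separate training stage: the same model trained with the DisMax loss provides both the class prediction and the OOD score.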