This is the repository for our ECCV 2022 paper, OccamNets: Mitigating Dataset Bias by Favoring Simpler Hypotheses. In this paper, we apply Occam's razor to neural networks so that they use only the required network depth and the required visual regions, which improves robustness to dataset bias.
- Install the dependencies by running `./requirements.sh`
- Specify the root directory (where the dataset/logs will be stored) in the `paths.root` entry inside `conf/base_config.yaml`
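For reference, the relevant entry might look like the sketch below; the exact YAML layout is an assumption inferred from the `paths.root` name above, so adjust it to the actual file contents.

```bash
# Sketch only: the YAML layout in the comment is an assumption inferred from
# the `paths.root` key named above.
#
#   # conf/base_config.yaml
#   paths:
#     root: /data/occamnets   # dataset/logs are stored under this directory
#
# Quick check that the entry points where you expect:
grep -A 2 "paths:" conf/base_config.yaml
```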
- Download BiasedMNISTv2 from: https://drive.google.com/file/d/1_77AKsY5MoYpDnXgNkjWi9n2_mfQBW-F/view?usp=sharing
- Provide the full path to Biased MNIST in `data_dir` inside `conf/dataset/biased_mnist.yaml`
- You can also generate Biased MNIST by using/modifying `./scripts/biased_mnist/generate.sh`
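A possible download-and-extract sequence is sketched below; the archive name and target directory are assumptions, so substitute whatever the Google Drive link provides and whatever you set as `data_dir`.

```bash
# Assumed file/directory names: use the actual archive downloaded from the
# Google Drive link above and the data_dir set in conf/dataset/biased_mnist.yaml.
mkdir -p /data/occamnets/biased_mnist
unzip biased_mnist_v2.zip -d /data/occamnets/biased_mnist

# Alternatively, regenerate the dataset locally instead of downloading it:
./scripts/biased_mnist/generate.sh
```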
- Download the COCO-on-Places dataset from: https://github.com/Faruk-Ahmed/predictive_group_invariance
- Specify the location of the dataset in `data_dir` of `conf/dataset/coco_on_places.yaml`
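Before training, it can help to sanity-check that both dataset configs point at real directories; the minimal check below just prints the `data_dir` lines named in the steps above.

```bash
# Print the configured dataset locations for both benchmarks:
grep "data_dir" conf/dataset/biased_mnist.yaml conf/dataset/coco_on_places.yaml
```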
- We provide bash scripts to train OccamResNet and ResNet (including baselines and SoTA debiasing methods on both architectures).
- Train baseline and SoTA methods on OccamResNet/ResNet using: `./scripts/{dataset}/{dataset_shortform}_{method}.sh`
- E.g., running `./scripts/biased_mnist/bmnist_occam.sh` trains OccamNet on Biased MNIST.
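For example (only `bmnist_occam.sh` is named above; the other script names follow the stated pattern and can be listed directly):

```bash
# Train OccamNet on Biased MNIST using the script named above:
./scripts/biased_mnist/bmnist_occam.sh

# The remaining runs follow the {dataset}/{dataset_shortform}_{method}.sh
# pattern; list the provided scripts to see the exact method names:
ls scripts/*/
```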
- Model definition: find OccamNets in `models/occam_resnet.py`, `occam_efficient_net.py` and `occam_mobile_net.py`
- Training script: `trainers/occam_trainer.py`
- Training configuration: `conf/trainer/occam_trainer.yaml` (all of these parameters can be overridden from the command line)
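To see which parameters are available, you can inspect the trainer defaults and the model variants directly. This is only a quick look: the exact command-line override syntax (e.g. whether it is a Hydra-style `key=value` form) is not stated in this README, and the `ls` pattern assumes all three model files named above live under `models/`.

```bash
# The trainer defaults referenced above; any of these parameters can be edited
# here (or overridden from the command line, per the note above):
cat conf/trainer/occam_trainer.yaml

# The OccamNet variants (assuming all three files named above live under models/):
ls models/occam_*.py
```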
If you find this work useful, please cite:

@inproceedings{shrestha2022occamnets,
title={OccamNets: Mitigating Dataset Bias by Favoring Simpler Hypotheses},
author={Shrestha, Robik and Kafle, Kushal and Kanan, Christopher},
booktitle={European Conference on Computer Vision (ECCV)},
year={2022}
}
This work was supported in part by NSF awards #1909696 and #2047556.