mlvc-lab/GLD

 
 


Distilling Global and Local Logits with Densely Connected Relations

Official PyTorch implementation of "Distilling Global and Local Logits with Densely Connected Relations", ICCV 2021.

| paper | supplementary material |

This repository contains the source code for the CIFAR-100 experimental setup (a). We provide a pre-trained teacher weight (the median of 3 runs) in the "teacher" directory, so distillation can be started without pre-training the teacher network. Training logs of the distilled student are in the "log" directory.

Setup (a): Teacher (ResNet-110), Student (ResNet-20).

Requirements

  • Python3
  • PyTorch (> 1.0)
  • torchvision (> 0.2)
  • NumPy

Training a teacher network (if needed)

python3 ./train.py --model resnet --depth 110 

Distilling the teacher network to the student network

python3 ./distill.py --teacher resnet --student resnet --depth 110 --sdepth 20 --alpha 0.7 --beta 500. --div 2
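For orientation, the --alpha flag typically interpolates between the hard-label cross-entropy and a soft-label distillation term, as in standard logit distillation (Hinton et al.). The sketch below shows that baseline objective only; the actual loss in distill.py adds the global/local logit relation terms controlled by --beta and --div, so function name, temperature, and weighting here are illustrative assumptions, not the repository's implementation.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, targets, alpha=0.7, T=4.0):
    """Baseline KD objective (sketch, not the GLD loss itself):
    alpha-weighted sum of soft-label KL and hard-label cross-entropy."""
    # Hard-label term: ordinary cross-entropy against ground-truth classes.
    ce = F.cross_entropy(student_logits, targets)
    # Soft-label term: KL between temperature-scaled distributions,
    # rescaled by T^2 to keep gradient magnitudes comparable.
    kl = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    return alpha * kl + (1.0 - alpha) * ce
```

With random CIFAR-100-shaped logits (batch of 8, 100 classes), the function returns a finite non-negative scalar suitable for backpropagation.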

Citation

@inproceedings{kim2021distilling,
  title={Distilling Global and Local Logits With Densely Connected Relations},
  author={Kim, Youmin and Park, Jinbae and Jang, YounHo and Ali, Muhammad and Oh, Tae-Hyun and Bae, Sung-Ho},
  booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision},
  pages={6290--6300},
  year={2021}
}
