
# Sub-center ArcFace

## 1. Motivation

We introduce one extra hyperparameter (the sub-center number `loss_K`) into ArcFace to relax the intra-class compactness constraint. In our experiments, we find that `loss_K=3` achieves a good balance between accuracy and robustness.

*(Figure: difference)*
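The core change relative to ArcFace is that each class keeps `loss_K` sub-center weights and a sample is only pulled towards its closest sub-center. Below is a minimal, hypothetical PyTorch sketch of such a sub-center margin head; the class name `SubCenterArcFaceHead` and the default scale/margin values are assumptions for illustration, not the repository's actual module:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SubCenterArcFaceHead(nn.Module):
    """Sketch of a sub-center ArcFace classification head (hypothetical, for illustration)."""

    def __init__(self, feat_dim=512, num_classes=1000, loss_K=3, s=64.0, m=0.5):
        super().__init__()
        self.loss_K, self.s, self.m = loss_K, s, m
        # loss_K sub-center weight vectors per class, stored class-major.
        self.weight = nn.Parameter(torch.empty(num_classes * loss_K, feat_dim))
        nn.init.xavier_uniform_(self.weight)

    def forward(self, feats, labels):
        # Cosine similarity between normalized features and all class sub-centers: [B, C*K].
        cos = F.linear(F.normalize(feats), F.normalize(self.weight))
        # Group the K sub-centers of each class and keep only the closest (dominant) one.
        cos = cos.view(-1, cos.size(1) // self.loss_K, self.loss_K).max(dim=2).values
        # Standard ArcFace additive angular margin on the ground-truth class.
        theta = torch.acos(cos.clamp(-1.0 + 1e-7, 1.0 - 1e-7))
        one_hot = F.one_hot(labels, num_classes=cos.size(1)).float()
        logits = self.s * torch.cos(theta + self.m * one_hot)
        return F.cross_entropy(logits, labels)
```

With `loss_K=1` this reduces to standard ArcFace; with a larger `loss_K`, noisy or hard samples can attach to a non-dominant sub-center instead of corrupting the dominant class center.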

## 2. Implementation

The training process of Sub-center ArcFace is almost the same as that of ArcFace. The increased GPU memory consumption can easily be alleviated by our parallel training framework.

*(Figure: framework)*

## 3. Training Dataset

1. MS1MV0 (the noise rate is around 50%): download link (baidu drive, code 8ql0) (dropbox)

## 4. Training Steps

1). Train Sub-center ArcFace (`loss_K=3`) on MS1MV0.

2). Drop non-dominant sub-centers and high-confidence noisy data (samples whose angle to the dominant sub-center exceeds 75 degrees); see the sketch after these steps.

```shell
python drop.py --data <ms1mv0-path> --model <step-1-pretrained-model> --threshold 75 --k 3 --output <ms1mv0-drop75-path>
```

3). Train ArcFace on the new MS1MV0-Drop75 dataset.
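The cleaning rule in step 2 can be summarized as: keep only the dominant sub-center of each class, and discard any training image whose embedding is more than the threshold (75 degrees) away from that dominant sub-center. Below is a minimal NumPy sketch of that filtering rule, assuming precomputed, L2-normalized embeddings and sub-center weights; the helper `keep_mask` is hypothetical and not part of `drop.py`:

```python
import numpy as np

def keep_mask(embeddings, labels, subcenters, threshold_deg=75.0):
    """embeddings: (N, D) L2-normalized features; labels: (N,) class ids;
    subcenters: (C, K, D) L2-normalized sub-center weights per class."""
    keep = np.zeros(len(labels), dtype=bool)
    for i, (feat, c) in enumerate(zip(embeddings, labels)):
        # Similarity to the K sub-centers of the labelled class; the max is the dominant one.
        cos = subcenters[c] @ feat
        angle = np.degrees(np.arccos(np.clip(cos.max(), -1.0, 1.0)))
        # Keep the sample only if it lies within the angular threshold.
        keep[i] = angle < threshold_deg
    return keep
```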

## 5. Pretrained Models and Logs

baidu drive (code 3jsh), gdrive

## Citation

If you find Sub-center ArcFace useful in your research, please consider citing the following paper:

```bibtex
@inproceedings{deng2020subcenter,
  title={Sub-center ArcFace: Boosting Face Recognition by Large-scale Noisy Web Faces},
  author={Deng, Jiankang and Guo, Jia and Liu, Tongliang and Gong, Mingming and Zafeiriou, Stefanos},
  booktitle={Proceedings of the European Conference on Computer Vision},
  year={2020}
}
```