
L-DAWA

This repository contains the source code for L-DAWA: Layer-wise Divergence Aware Weight Aggregation in Federated Self-Supervised Visual Representation Learning, accepted at ICCV 2023.
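At a high level (as the method's name suggests), L-DAWA measures the layer-wise divergence between each client's model and the global model and uses it to weight each client's layer-wise contribution during aggregation. The PyTorch sketch below is a minimal illustration of that idea, assuming cosine similarity between a client's layer and the previous round's global layer as the per-layer aggregation weight; it is not the reference implementation (see pretrain_FL.py and the paper for the exact formulation).

```python
# Illustrative sketch of layer-wise divergence-aware aggregation.
# NOT the reference implementation; see pretrain_FL.py and the paper
# for the exact formulation. Assumes one cosine-similarity-based
# weight per layer per client, normalized across clients.
import torch

def l_dawa_aggregate(global_state, client_states):
    """Aggregate client models layer by layer.

    global_state:  dict of layer name -> tensor (previous round's global model)
    client_states: list of such dicts (client models after local training)
    """
    new_state = {}
    for name, g in global_state.items():
        g_flat = g.flatten().float()
        weights, layers = [], []
        for cs in client_states:
            c = cs[name].float()
            # Layer-wise divergence proxy: cosine similarity between the
            # client's layer and the previous global layer.
            sim = torch.nn.functional.cosine_similarity(c.flatten(), g_flat, dim=0)
            weights.append(sim.clamp(min=0.0))  # assumption: drop negative similarity
            layers.append(c)
        w = torch.stack(weights)
        w = w / w.sum().clamp(min=1e-12)  # normalize weights across clients
        new_state[name] = sum(wi * li for wi, li in zip(w, layers)).to(g.dtype)
    return new_state
```

Weighting per layer rather than per model means layers that drifted strongly on a biased client contribute less, while well-aligned layers aggregate much as they would under FedAvg.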

Paper, Supplementary materials

Authors

Requirements:

  • For a complete list of the required packages, please see the requirement.txt file. You can install all the requirements by running pip install -r requirement.txt.

Tutorials and Helping materials

How to run the code

  • Go to the folder FL_Pretraining and run the pretrain_FL.py script.
  • To run finetuning, execute the finetune_script.py script (see the example invocation after this list).
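A typical invocation might look like the following; any command-line flags beyond the script names are assumptions, so check each script's argument parser for the options it actually accepts:

```bash
# From the repository root (hypothetical invocation; the scripts'
# actual flags are defined in their own argument parsing):
cd FL_Pretraining
python pretrain_FL.py      # federated self-supervised pretraining
python finetune_script.py  # finetuning of the pretrained model
```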

Datasets

  • CIFAR-10, CIFAR-100, Tiny-ImageNet

Way forward

Supervised FL

  • Although our main motivation for designing L-DAWA was to tackle client bias in Cross-Silo FL scenarios in which each client runs a Self-Supervised Learning (SSL) algorithm, L-DAWA works equally well in Cross-Silo FL scenarios with Supervised Learning (SL) algorithms, as shown below (see the Supplementary materials for details). In the tables, E1, E5, and E10 denote results obtained with 1, 5, and 10 local epochs per round, respectively.

| Aggregation Type | E1    | E5    | E10   |
|------------------|-------|-------|-------|
| FedAvg           | 77.91 | 83.76 | 81.31 |
| FedYogi          | 77.49 | 72.50 | 74.85 |
| FedProx          | 80.55 | 74.87 | 72.24 |
| L-DAWA           | 81.96 | 84.68 | 82.35 |

Performance with Large Models

The results below are obtained by pretraining SimCLR in Cross-Silo FL settings, followed by linear finetuning.
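In the SSL literature, linear finetuning (linear evaluation) typically keeps the pretrained encoder frozen and trains only a linear classification head on its features. The PyTorch sketch below illustrates that protocol; finetune_script.py is the repository's actual entry point, and the hyperparameters shown are assumptions for the sketch, not the paper's exact values.

```python
# Minimal linear-evaluation sketch (illustrative only; finetune_script.py
# is the repository's actual entry point). The encoder is assumed to map
# a batch of images to flat feature vectors of size feat_dim.
import torch
import torch.nn as nn

def build_linear_probe(encoder: nn.Module, feat_dim: int, num_classes: int):
    # Freeze the SSL-pretrained encoder so only the head is trained.
    for p in encoder.parameters():
        p.requires_grad = False
    encoder.eval()
    head = nn.Linear(feat_dim, num_classes)
    # Optimizer settings are assumptions, not the paper's exact values.
    optimizer = torch.optim.SGD(head.parameters(), lr=0.1, momentum=0.9)
    return head, optimizer

def train_step(encoder, head, optimizer, images, labels):
    with torch.no_grad():
        feats = encoder(images)  # features from the frozen encoder
    loss = nn.functional.cross_entropy(head(feats), labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```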

Results on CIFAR-100

| Method | Architecture | E1    | E5    | E10   |
|--------|--------------|-------|-------|-------|
| FedAvg | ResNet34     | 54.76 | 69.81 | 74.21 |
| FedU   | ResNet34     | 52.85 | 67.84 | 72.21 |
| L-DAWA | ResNet34     | 61.92 | 73.33 | 77.30 |
| FedAvg | ResNet50     | 63.62 | 75.39 | 79.41 |
| FedU   | ResNet50     | 57.61 | 71.44 | 76.85 |
| L-DAWA | ResNet50     | 63.90 | 75.58 | 79.11 |

Results on Tiny-ImageNet

| Method | Architecture | E1    |
|--------|--------------|-------|
| FedAvg | ResNet34     | 11.93 |
| FedU   | ResNet34     | 11.75 |
| L-DAWA | ResNet34     | 18.64 |
| FedAvg | ResNet50     | 13.51 |
| FedU   | ResNet50     | 13.22 |
| L-DAWA | ResNet50     | 19.04 |

News

  • Added the source code for L-DAWA
  • Added the inventory folder
  • Added the FL_pretraining script
  • This repository is still a work in progress

Issues:

If you encounter any problems, feel free to open an issue on GitHub.

Citations

@inproceedings{rehman2023dawa,
  title={L-DAWA: Layer-wise Divergence Aware Weight Aggregation in Federated Self-Supervised Visual Representation Learning},
  author={Rehman, Yasar Abbas Ur and Gao, Yan and de Gusmao, Pedro Porto Buarque and Alibeigi, Mina and Shen, Jiajun and Lane, Nicholas D},
  booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision},
  pages={16464--16473},
  year={2023}
}
