
Official code of MoSA (Mixture of Sparse Adapters).

Theia-4869/MoSA

Mixture of Sparse Adapters

This repository contains the official PyTorch implementation for MoSA.

(Figure: MoSA demo)

Environment setup

See env_setup.sh

Datasets preparation

  • Fine-Grained Visual Classification (FGVC): The datasets can be downloaded via the official links. Where no public validation set is available, we split the training data; the resulting splits can be found here: Dropbox, Google Drive.

  • Visual Task Adaptation Benchmark (VTAB): See VTAB_SETUP.md for detailed instructions and tips.

  • General Image Classification Datasets (GICD): These datasets are downloaded automatically the first time you run a MoSA experiment that uses them.

Pre-trained weights preparation

Download the pre-trained Transformer-based backbones and place them in MODEL.MODEL_ROOT.

| Pre-trained Backbone | Link | md5sum |
| -------------------- | ---- | ------ |
| ViT-B/16             | link | d9715d |
| ViT-L/16             | link | 8f39ce |
| Swin-B               | link | bf9cc1 |
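
The md5sum column can be used to sanity-check a download before training. A minimal sketch of such a check, where the helper name, checkpoint filename, and `MODEL_ROOT` layout are illustrative assumptions rather than part of this repo:

```shell
#!/bin/sh
# Hypothetical helper: verify that a downloaded checkpoint's md5 hash
# starts with the short prefix listed in the table above.
check_md5_prefix() {
    file="$1"; expected="$2"
    # Take the first ${#expected} hex characters of the file's md5 digest.
    actual=$(md5sum "$file" | cut -c1-${#expected})
    [ "$actual" = "$expected" ]
}

# Illustrative usage after downloading ViT-B/16 into MODEL.MODEL_ROOT:
#   check_md5_prefix "$MODEL_ROOT/ViT-B_16.npz" d9715d && echo "checksum OK"
```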

Training

To fine-tune a pre-trained ViT model via MoSA on FGVC-cub, you can run:

bash scripts/mosa/vit/FGVC/cub.sh
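
Other configurations appear to follow the same path layout. A hedged sketch of that pattern; only cub.sh is confirmed above, so any other backbone, benchmark, or dataset name is an assumption to be checked against the scripts/ directory:

```shell
#!/bin/sh
# Hypothetical helper: build the launch-script path for a given
# backbone / benchmark / dataset triple, mirroring the cub.sh example.
mosa_script() {
    backbone="$1"; benchmark="$2"; dataset="$3"
    echo "scripts/mosa/${backbone}/${benchmark}/${dataset}.sh"
}

# e.g. bash "$(mosa_script vit FGVC cub)"
```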

License

The majority of MoSA is licensed under the CC-BY-NC 4.0 license (see LICENSE for details).
