Contrastive ResNet-50

Abstract

Contrastive learning is a widely adopted technique for training models to encode representations by maximizing the similarity between differently augmented views of the same data point, while minimizing the similarity between representations of different data points. This approach leverages the inherent structure within the data and is particularly effective in scenarios with limited labeled data. In this study, we use SimCLR, a prominent contrastive learning framework, as a pre-training step to learn meaningful representations from unlabeled skin lesion images. Through experimental evaluations on the ISIC dataset, we demonstrate significant improvements in accuracy and robustness over a traditional supervised learning baseline.


References

- SimCLR paper: Chen et al., "A Simple Framework for Contrastive Learning of Visual Representations" (2020)
- SupContrast paper: Khosla et al., "Supervised Contrastive Learning" (2020)
- Code: SupContrast repository


Dependencies

You can install all required packages via:

```shell
pip install -r requirements.txt
```

Instructions

Run the following command to split the dataset into train, validation, and test sets:

```shell
python3 data_split.py
```

Then run the following command to train the models and run inference:

```shell
python3 main.py
```

Loss function

$\LARGE \ell_{i,j} = -\log\frac{\exp\left(z_i^\top z_j/\tau\right)}{\sum_{k=1}^{2N}\mathbb{1}_{[k\neq i]}\exp\left(z_i^\top z_k/\tau\right)}$
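As a concrete reference, the NT-Xent loss above can be sketched in NumPy as follows. The function name, batch layout (rows `i` and `i+N` are the two views of the same image), and default temperature are illustrative assumptions, not the repo's actual implementation (which builds on the SupContrast code):

```python
import numpy as np

def nt_xent_loss(z, tau=0.5):
    """NT-Xent loss for a batch of 2N embeddings (illustrative sketch).

    z: array of shape (2N, d); rows i and i+N are two augmented views of
    the same image. Returns the mean loss over all 2N anchors.
    """
    z = z / np.linalg.norm(z, axis=1, keepdims=True)  # L2-normalize rows
    two_n = z.shape[0]
    n = two_n // 2
    sim = (z @ z.T) / tau                 # pairwise cosine similarities / tau
    np.fill_diagonal(sim, -np.inf)        # implements the indicator 1[k != i]
    # the positive for anchor i is its other augmented view
    pos = np.concatenate([np.arange(n, two_n), np.arange(0, n)])
    # log-softmax over each row, then pick the positive's log-probability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(two_n), pos].mean()
```

The loss is small when each anchor's two views map to nearby embeddings relative to all other samples in the batch, which is exactly the behavior the equation above rewards.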


Results

| Model | Test accuracy (%) | Test AUC |
| --- | --- | --- |
| Baseline | 79.4 | 73.1 |
| SimCLR (100 epochs) | 83.1 | 78.5 |
| SimCLR (300 epochs) | 81.5 | 76.6 |
| SimCLR (500 epochs) | 80.7 | 74.2 |

License

This project is licensed under the MIT License - see the LICENSE file for details.

