
PGD-Implemented-Adversarial-attack-on-CIFAR10

Example code implementing the PGD and FGSM algorithms for adversarial attacks on CIFAR-10.
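For reference, the core of both attacks is a signed-gradient step on the input. Below is a minimal PGD sketch in PyTorch; the function name and hyperparameters are illustrative, not necessarily what `main.py` uses:

```python
import torch
import torch.nn.functional as F

def pgd_attack(model, x, y, eps=8/255, alpha=2/255, steps=10):
    """PGD: repeat signed-gradient ascent steps on the input, keeping the
    perturbation inside an L-infinity ball of radius eps.
    With steps=1 and alpha=eps this reduces to FGSM.
    (Illustrative sketch; hyperparameters are assumptions.)"""
    x_adv = x.clone().detach()
    for _ in range(steps):
        x_adv.requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)
        grad = torch.autograd.grad(loss, x_adv)[0]
        # Step in the direction that increases the loss.
        x_adv = x_adv.detach() + alpha * grad.sign()
        # Project back into the eps-ball around the clean input and into [0, 1].
        delta = torch.clamp(x_adv - x, -eps, eps)
        x_adv = torch.clamp(x + delta, 0.0, 1.0)
    return x_adv
```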

Pretrained models

The pretrained models are from here.
Please download the pretrained models first and put them in /cifar10_models/state_dicts as instructed in the link above.
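Assuming the downloaded weights are plain PyTorch state dicts and the model definitions live under `cifar10_models/` as the folder layout suggests (both assumptions, since the exact module layout depends on the linked repository), loading a model looks roughly like this:

```python
import torch
from cifar10_models.resnet import resnet50  # assumed module path and constructor

model = resnet50()
# Path follows the state_dicts folder described above; file name is an assumption.
state_dict = torch.load("cifar10_models/state_dicts/resnet50.pt", map_location="cpu")
model.load_state_dict(state_dict)
model.eval()
```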

Prepare normal examples

Please prepare your CIFAR-10 normal examples, with each class in its own folder, like this:
| imgs/
| - frog
| -- frog1.png frog2.png ...
| - automobile
| -- ...
| - ...
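This one-folder-per-class layout matches torchvision's `ImageFolder` convention, so the images can be loaded as a sketch like the following (the transform is illustrative; add whatever normalization the pretrained models expect):

```python
import torchvision.transforms as T
from torchvision.datasets import ImageFolder
from torch.utils.data import DataLoader

transform = T.Compose([T.ToTensor()])  # add model-specific normalization if required
dataset = ImageFolder("imgs", transform=transform)  # subfolder names become class labels
loader = DataLoader(dataset, batch_size=32, shuffle=False)
```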

Generate adversarial examples


$python3 main.py -I input_normal_examples_path -M model -T mode -O adversarial_examples_folder_name
model: one of vgg16_bn, resnet50, mobilenet_v2, densenet161
mode: PGD or FGSM
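For example, to attack resnet50 with PGD (the folder names here are placeholders):

$python3 main.py -I imgs -M resnet50 -T PGD -O adv_imgs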

Investigate transferability


$python3 transferability.py -I input_normal_examples_path -O 1or0
-O: whether to generate a confusion table (1) or not (0)
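Transferability here means how often adversarial examples crafted against one (source) model also fool a different (target) model. A hedged sketch of the evaluation, using the loading snippets above (transferability.py may differ in detail):

```python
import torch

@torch.no_grad()
def transfer_success_rate(target_model, adv_loader, device="cpu"):
    """Fraction of adversarial examples, crafted on a *source* model,
    that a *target* model misclassifies. (Illustrative helper, not
    necessarily the exact logic in transferability.py.)"""
    target_model.eval().to(device)
    fooled, total = 0, 0
    for x_adv, y in adv_loader:
        pred = target_model(x_adv.to(device)).argmax(dim=1)
        fooled += (pred != y.to(device)).sum().item()
        total += y.size(0)
    return fooled / total
```

Running this for every (source, target) model pair yields the confusion table that the -O flag toggles.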