First of all, thanks for your experiments and code!
I have tried quantizing ArcFace models before and ran into two specific issues.
Both issues trace back to the code here: https://github.com/DavorJordacevic/IResNet-ArcFace-Knowledge-Distillation/blob/5a7ade543cd422b6b318c20c48794de317e63da7/backbones/iresnet.py#LL37C9-L44C29
First, the bn1 layer cannot be fused with any adjacent layer, and the quantized model cannot be exported to ONNX.
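To illustrate why bn1 resists fusion: PyTorch's eager-mode `fuse_modules` only recognizes a few fixed patterns (e.g. Conv+BN, Conv+BN+ReLU), and in the IResNet stem bn1 is applied directly to the input, with no preceding conv to fold it into. A minimal sketch (the `Stem` module below is illustrative, not the repository's actual class):

```python
import torch
import torch.nn as nn

class Stem(nn.Module):
    """Toy stand-in for the IResNet stem: an input-side BN with no conv before it."""
    def __init__(self):
        super().__init__()
        self.bn1 = nn.BatchNorm2d(3)               # no conv precedes this BN
        self.conv1 = nn.Conv2d(3, 8, 3, padding=1)
        self.bn2 = nn.BatchNorm2d(8)
        self.relu = nn.ReLU()

    def forward(self, x):
        return self.relu(self.bn2(self.conv1(self.bn1(x))))

m = Stem().eval()  # fuse_modules requires eval mode
# Only conv1 -> bn2 -> relu matches a supported fusion pattern;
# bn1 has nothing to fuse with and is left as a standalone BatchNorm2d.
fused = torch.ao.quantization.fuse_modules(m, [["conv1", "bn2", "relu"]])
print(type(fused.conv1).__name__)  # fused conv (BN folded in)
print(type(fused.bn1).__name__)    # still a plain BatchNorm2d
```

After fusion, bn2 and relu become `nn.Identity` placeholders while bn1 survives unchanged, which is the unfused standalone op that then causes trouble at quantization/export time.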
Second, using PReLU drops the quantized model's accuracy to near zero. This appears to stem from something in PyTorch's own quantization code; replacing PReLU with ReLU during training resolves the issue.
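The swap described above can be done without editing the backbone definition, by recursively replacing every PReLU module in place before (re)training. A minimal sketch, with an illustrative helper name:

```python
import torch.nn as nn

def replace_prelu_with_relu(module: nn.Module) -> None:
    """Recursively swap every nn.PReLU child for nn.ReLU, in place.

    Sketch of the workaround described above: if quantizing a PReLU
    network collapses accuracy, retrain with ReLU substituted in.
    Note: this changes the activation, so the model must be retrained
    (or fine-tuned), not just converted.
    """
    for name, child in module.named_children():
        if isinstance(child, nn.PReLU):
            setattr(module, name, nn.ReLU(inplace=True))
        else:
            replace_prelu_with_relu(child)

# Toy network standing in for the IResNet backbone
net = nn.Sequential(
    nn.Conv2d(3, 8, 3), nn.PReLU(8),
    nn.Conv2d(8, 8, 3), nn.PReLU(8),
)
replace_prelu_with_relu(net)
print(all(not isinstance(m, nn.PReLU) for m in net.modules()))  # True
```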
My question to you is: did you come across these issues, and if so, how did you solve them?