
Are MACs and FLOPs counted correctly for an INT8 quantized model? #202

Open
Abanoub-G opened this issue Apr 13, 2023 · 1 comment

Comments

@Abanoub-G

Hi,
I am trying to use thop's profile function to measure the MACs and FLOPs of a model before and after quantisation (a rough sketch of what I mean is below the list).

  • Does the current MAC-counting implementation count the INT8 quantized operations in a quantized model, or only the floating-point (FP) ones?

  • If the implementation counts both INT and FP, then the FLOPs calculation (FLOPs = 2 × MACs, based on this reply) will not be accurate, since it would include both integer and floating-point operations, whereas FLOPs should count only the FP operations.

  • In the other case, where the MAC count includes only FP and not INT operations, the FLOPs calculation (FLOPs = 2 × MACs) will be fine, but how would one then count the INT operations?
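
For reference, here is a rough sketch of the kind of before/after profiling I have in mind. The toy model, input shape, and the dynamic-quantisation call are only assumptions for illustration; I am not sure whether thop counts the quantised Linear modules at all, since they are a different module type from `nn.Linear`.

```python
import torch
import torch.nn as nn
from thop import profile

# Toy FP32 model and input, used only for illustration (assumptions, not my real model).
model_fp32 = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 10))
dummy_input = torch.randn(1, 128)

# MACs/params reported by thop for the FP32 model.
macs_fp32, params_fp32 = profile(model_fp32, inputs=(dummy_input,))
print(f"FP32: MACs={macs_fp32}, params={params_fp32}")

# One possible quantisation route: dynamic INT8 quantisation of the Linear layers.
model_int8 = torch.quantization.quantize_dynamic(
    model_fp32, {nn.Linear}, dtype=torch.qint8
)

# Profiling the quantised model: the quantised Linear layers are a different module
# type, so it is unclear whether their (integer) MACs are counted here at all.
macs_int8, params_int8 = profile(model_int8, inputs=(dummy_input,))
print(f"INT8: MACs={macs_int8}, params={params_int8}")
```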

Any advice/feedback will be greatly appreciated.
Thanks

@HaoKang-Timmy
Contributor

Sorry, we do not currently support density or quantized calculations. This might improve in the future.
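
(Not part of the reply above, just a thought on a possible interim workaround.) Since thop's `profile` accepts a `custom_ops` mapping from module type to a counting function, one might register a counter for the dynamically quantised Linear type and tally its integer MACs separately from FP FLOPs. The module type `torch.nn.quantized.dynamic.Linear` and the counting rule below are assumptions about one particular quantisation setup, not something thop does out of the box.

```python
import torch
import torch.nn as nn
from thop import profile

def count_quantized_linear(m, x, y):
    # Integer MACs for a dynamically quantised Linear: each output element
    # requires in_features multiply-accumulates (these are INT ops, not FLOPs).
    m.total_ops += y.numel() * m.in_features

# Assumed setup: a toy model quantised with dynamic INT8 quantisation.
model_int8 = torch.quantization.quantize_dynamic(
    nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 10)),
    {nn.Linear},
    dtype=torch.qint8,
)

# Register the custom counter for the quantised Linear module type.
macs, params = profile(
    model_int8,
    inputs=(torch.randn(1, 128),),
    custom_ops={torch.nn.quantized.dynamic.Linear: count_quantized_linear},
)
print(f"INT8 MACs (integer multiply-accumulates): {macs}")
```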
