How to check which CPU ISA (AVX/AVX512/AMX) was used in inference #25925
TuongNguyen94 asked this question in EP Q&A (unanswered)
Hi all,
I am working on a project that aims to run a model on Intel AMX. I quantized the model, built ONNX Runtime with the --use_dnnl flag, and deployed the resulting library. However, when I run inference in C++, I cannot find a way to verify that the model actually uses AMX. Is there any way to check this? I have looked at the ORT logs but see nothing besides "provider" : "CPUExecutionProvider".
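For reference, here is a minimal sketch of the C++ setup described above. It is not a confirmed answer: the model path is a placeholder, the DNNL provider header name and registration call can differ between ONNX Runtime versions, and the exact wording of the oneDNN verbose output may vary. The idea is to enable ORT verbose logging plus oneDNN's own ONEDNN_VERBOSE output, whose startup lines report the CPU ISA the library dispatches to (e.g. AVX-512 with AMX).

```cpp
// Sketch: create an ORT session with the oneDNN (DNNL) EP, with both ORT verbose
// logging and oneDNN's ONEDNN_VERBOSE output enabled, so the selected CPU ISA
// shows up on stderr. Model path and header name are assumptions.
#include <cstdlib>
#include <iostream>
#include <onnxruntime_cxx_api.h>
#include <dnnl_provider_factory.h>   // header/location may differ across ORT versions

int main() {
  // Ask oneDNN to report what it actually executes; its info lines include
  // something like "onednn_verbose,info,cpu,isa:Intel AVX-512 with Intel AMX ..."
  // when AMX kernels are available and selected. Must be set before first use.
  setenv("ONEDNN_VERBOSE", "1", /*overwrite=*/1);

  // Turn ORT logging up to VERBOSE so per-node EP placement is visible in the log.
  Ort::Env env(ORT_LOGGING_LEVEL_VERBOSE, "isa-check");
  Ort::SessionOptions opts;
  opts.SetLogSeverityLevel(ORT_LOGGING_LEVEL_VERBOSE);

  // Register the DNNL EP ahead of the default CPU EP
  // (newer ORT versions expose a different registration API for DNNL).
  Ort::ThrowOnError(OrtSessionOptionsAppendExecutionProvider_Dnnl(opts, /*use_arena=*/1));

  Ort::Session session(env, "model_int8.onnx", opts);  // placeholder model path
  std::cout << "Session created; check stderr for onednn_verbose lines.\n";
  return 0;
}
```

As a cross-check (again assuming the standard oneDNN environment variables), capping the ISA with ONEDNN_MAX_CPU_ISA (e.g. AVX512_CORE vs. the default) and comparing the verbose output or timings should show whether AMX kernels were in play.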