Error when loading quantized model #13579
Unanswered
wangjia184 asked this question in Other Q&A
I have a TensorFlow 2 Keras model, which I converted to an ONNX model and then preprocessed and quantized. The model files can be downloaded here.

Loading the quantized model fails with the error: `Node (QLinearConv_token_383)'s input 0 is marked single but has an empty string in the graph`.

Here is the Python code for the conversion.
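Roughly, the conversion and quantization steps look like the following; the model path, input shape, opset, and calibration data are placeholders rather than the exact values from the original script:

```python
import numpy as np
import tensorflow as tf
import tf2onnx
from onnxruntime.quantization import (CalibrationDataReader, QuantFormat,
                                      QuantType, quantize_static)

# 1. Convert the Keras model to ONNX (path, input shape, and opset are placeholders).
model = tf.keras.models.load_model("my_keras_model")
spec = (tf.TensorSpec((1, 224, 224, 3), tf.float32, name="input"),)
tf2onnx.convert.from_keras(model, input_signature=spec, opset=13,
                           output_path="model.onnx")

# 2. Preprocess for quantization (shape inference / graph optimization), e.g.:
#    python -m onnxruntime.quantization.preprocess \
#        --input model.onnx --output model-preprocessed.onnx

# 3. Statically quantize. The calibration reader below feeds random data
#    as a stand-in for real calibration samples.
class RandomCalibrationReader(CalibrationDataReader):
    def __init__(self, count=8):
        self._samples = iter(
            {"input": np.random.rand(1, 224, 224, 3).astype(np.float32)}
            for _ in range(count))

    def get_next(self):
        return next(self._samples, None)

quantize_static("model-preprocessed.onnx", "model-quantized.onnx",
                RandomCalibrationReader(),
                quant_format=QuantFormat.QOperator,  # emits QLinear* operators
                activation_type=QuantType.QUInt8,
                weight_type=QuantType.QInt8)
```

`QuantFormat.QOperator` is what produces QLinear* operators such as the `QLinearConv` named in the error; with `QuantFormat.QDQ` the graph would instead contain QuantizeLinear/DequantizeLinear pairs around float operators.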
I hit this when calling the `CreateSession` C API on the quantized model. The preprocessed model loads successfully, so the problem appears to come from the quantization step. What is the issue?
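For reference, the failure should be reproducible from Python as well, since `onnxruntime.InferenceSession` performs the same model loading and graph validation as the C API's `CreateSession` (file names below are placeholders):

```python
import onnxruntime as ort

# The preprocessed (float) model loads without complaint ...
ort.InferenceSession("model-preprocessed.onnx", providers=["CPUExecutionProvider"])

# ... while creating a session for the quantized model is expected to fail at
# load time with the QLinearConv_token_383 error quoted above.
ort.InferenceSession("model-quantized.onnx", providers=["CPUExecutionProvider"])
```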
Replies: 1 comment · 1 reply

@yufenglee, would you please help?