Both source and target dimension have values but they differ #19935
Unanswered
Mugutech62 asked this question in Mobile Q&A
Replies: 1 comment, 1 reply
-
It would appear that changing the kernel size alters the output shape of your MaxPool, which is likely why the Concat is failing: its inputs no longer have matching dimensions. You would need to update the model so that the change doesn't break other components downstream. I would suggest making the modification in the original framework first and then re-exporting the model to ONNX.
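A minimal sketch of why the shapes diverge, using the standard pooling output-size formula from the ONNX MaxPool spec. The input size of 24, stride 1, and zero padding below are assumptions chosen purely to reproduce the 22-vs-20 numbers in the error message; the real model's values may differ:

```python
# Output spatial size of MaxPool (floor mode), per the ONNX operator spec:
#   out = floor((in + 2*pad - kernel) / stride) + 1
def maxpool_out(in_size: int, kernel: int, stride: int = 1, pad: int = 0) -> int:
    return (in_size + 2 * pad - kernel) // stride + 1

# Hypothetical input size 24 (not stated in the original post):
print(maxpool_out(24, 3))  # 22 -> with the edited 3x3 kernel
print(maxpool_out(24, 5))  # 20 -> with the original 5x5 kernel
```

Because the branch feeding the other Concat input still produces the old size, the two inputs disagree on dimension 2 and shape inference rejects the model at load time.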
-
Hello experts,
I am currently working on optimizing ONNX runtimes by editing certain nodes with onnx_modifier. As per our requirement, I have to change a MaxPool node's kernel shape from 5×5 to 3×3. After making the edit, I try to compile the model and run inference, but I hit a runtime error. Kindly help me solve this issue.
Before making changes: (screenshot of the original MaxPool node, not included)
After making changes: (screenshot of the edited MaxPool node, not included)
Error:

```
  File "/home/mugu/edgeai-tidl-tools/examples/osrt_python/ort/onnxrt_ep.py", line 200, in run_model
    sess = rt.InferenceSession(config['model_path'], providers=EP_list, provider_options=[delegate_options, {}], sess_options=so)
  File "/home/mugu/.local/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 362, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/home/mugu/.local/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 399, in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from /home/mugu/edgeai-tidl-tools/model/modified_best.onnx failed:
Node (/model.9/Concat) Op (Concat) [ShapeInferenceError] Can't merge shape info. Both source and target dimension have values but they differ. Source=22 Target=20 Dimension=2
```
Thanks in advance!