Hi, I have an ONNX model that I originally converted from RLlib. I can run inference with this model from Python, but I'm struggling to make it work in C++: I get an `Ort::Exception at memory location` error raised at `Ort::ThrowOnError(const OrtApi& ort, OrtStatus* status)` (line 17).

Here is the (very simple) model in question.
Here is my working Python code:
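It boils down to something like this (a trimmed-down sketch; the input names and shapes below are placeholders, not my model's real ones):

```python
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("model.onnx")

# Placeholder names/shapes standing in for the model's real inputs.
feeds = {
    "obs": np.zeros((1, 4), dtype=np.float32),
    "state_ins": np.zeros((1,), dtype=np.float32),
    "seq_lens": np.ones((1,), dtype=np.int32),
}

# Only the output I actually care about.
out = sess.run(["learned-0/model/fc_out/BiasAdd:0"], feeds)
print(out[0])
```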
Here is the C++ code that is not working:
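Structurally it looks like this (again a reduced sketch; the input names, shapes, and dummy data are placeholders for the real ones):

```cpp
#include <onnxruntime_cxx_api.h>
#include <array>
#include <iostream>
#include <vector>

int main() {
    Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "rllib-model");
    Ort::SessionOptions opts;
    // Windows build, hence the wide-string model path.
    Ort::Session session(env, L"model.onnx", opts);

    Ort::MemoryInfo mem =
        Ort::MemoryInfo::CreateCpu(OrtArenaAllocator, OrtMemTypeDefault);

    // Placeholder buffers for the three inputs the model expects.
    std::vector<float> obs(4, 0.0f);
    std::array<int64_t, 2> obs_shape{1, 4};
    std::vector<float> state(1, 0.0f);
    std::array<int64_t, 1> state_shape{1};
    std::vector<int32_t> seq_lens(1, 1);
    std::array<int64_t, 1> seq_shape{1};

    std::vector<Ort::Value> inputs;
    inputs.push_back(Ort::Value::CreateTensor<float>(
        mem, obs.data(), obs.size(), obs_shape.data(), obs_shape.size()));
    inputs.push_back(Ort::Value::CreateTensor<float>(
        mem, state.data(), state.size(), state_shape.data(), state_shape.size()));
    inputs.push_back(Ort::Value::CreateTensor<int32_t>(
        mem, seq_lens.data(), seq_lens.size(), seq_shape.data(), seq_shape.size()));

    const char* input_names[] = {"obs", "state_ins", "seq_lens"};  // placeholders
    const char* output_names[] = {"learned-0/model/fc_out/BiasAdd:0"};

    // This is the call that ends in Ort::ThrowOnError for me.
    auto outputs = session.Run(Ort::RunOptions{nullptr},
                               input_names, inputs.data(), inputs.size(),
                               output_names, 1);

    std::cout << outputs[0].GetTensorMutableData<float>()[0] << std::endl;
    return 0;
}
```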
If it helps, here are the model inputs (only the first three should need to be fed to the model):

and the model outputs (I'm only interested in the `learned-0/model/fc_out/BiasAdd:0` output; I also tried running the inference for all outputs, but it gave the same error):

I'm particularly confused by this issue because I have previously run inference on a different ONNX model (with different inputs and outputs) from both Python and C++, and everything worked fine.
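A minimal sketch for cross-checking the names and shapes the session itself reports against what gets passed to `Run` (this uses `GetInputNameAllocated` from the current C++ API; older onnxruntime releases expose `GetInputName` instead):

```cpp
#include <onnxruntime_cxx_api.h>
#include <iostream>

// Dump the input/output names and input shapes the session reports,
// to compare against the names handed to Run().
void PrintModelIO(Ort::Session& session) {
    Ort::AllocatorWithDefaultOptions alloc;
    for (size_t i = 0; i < session.GetInputCount(); ++i) {
        auto name = session.GetInputNameAllocated(i, alloc);
        auto shape = session.GetInputTypeInfo(i)
                         .GetTensorTypeAndShapeInfo()
                         .GetShape();
        std::cout << "input  " << i << ": " << name.get() << " [";
        for (int64_t d : shape) std::cout << d << " ";
        std::cout << "]\n";
    }
    for (size_t i = 0; i < session.GetOutputCount(); ++i) {
        auto name = session.GetOutputNameAllocated(i, alloc);
        std::cout << "output " << i << ": " << name.get() << "\n";
    }
}
```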
Any help would be greatly appreciated.
Thank you.
Replies: 1 comment

Do you have the full call stack?
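For what it's worth, one way to surface that full error text is to wrap the `Run` call in a try/catch for `Ort::Exception` (a minimal sketch; `RunLogged` is just an illustrative helper name):

```cpp
#include <onnxruntime_cxx_api.h>
#include <iostream>
#include <vector>

// Wraps Session::Run so the full onnxruntime error text is printed,
// rather than only the debugger's "Ort::Exception at memory location" notice.
std::vector<Ort::Value> RunLogged(Ort::Session& session,
                                  const char* const* input_names,
                                  const Ort::Value* inputs, size_t n_inputs,
                                  const char* const* output_names, size_t n_outputs) {
    try {
        return session.Run(Ort::RunOptions{nullptr},
                           input_names, inputs, n_inputs,
                           output_names, n_outputs);
    } catch (const Ort::Exception& e) {
        std::cerr << "ORT error " << e.GetOrtErrorCode()
                  << ": " << e.what() << std::endl;
        throw;  // rethrow after logging
    }
}
```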