I want to find out how to store other user-defined parameters in an ONNX model and then access them with the C++ ONNX Runtime API. Here is a detailed description.

I have the following neural network model in PyTorch. It's a simple MLP with 2 hidden layers, 50 + 50 inputs and 3 outputs (because $p = q = 1$):
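Roughly like this (a minimal sketch, since the exact hidden widths and activations don't matter for the question; how `scale` and `offset` are stored on the module is just for illustration):

```python
import torch
import torch.nn as nn

class MLP(nn.Module):
    """Simple MLP: 100 inputs (50 + 50), two hidden layers, 3 outputs."""

    def __init__(self, scale, offset, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(100, hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 3),
        )
        # User-defined parameters I want to carry along with the exported model.
        # They are not used in forward(), so they won't appear in the ONNX graph.
        self.scale = torch.as_tensor(scale, dtype=torch.float32)
        self.offset = torch.as_tensor(offset, dtype=torch.float32)

    def forward(self, x):
        return self.net(x)
```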
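One approach I've been looking at (sketch only, not tested end to end, using the `MLP` class sketched above): export the network with `torch.onnx.export`, then write `scale` and `offset` into the model's `metadata_props` with the `onnx` Python package. The key names, file name, and JSON encoding are my own choices here.

```python
import json
import onnx
import torch

model = MLP(scale=[2.0, 2.0, 2.0], offset=[0.1, 0.2, 0.3])
dummy = torch.randn(1, 100)
torch.onnx.export(model, dummy, "mlp.onnx",
                  input_names=["input"], output_names=["output"])

# Attach the user-defined parameters as custom key/value metadata.
onnx_model = onnx.load("mlp.onnx")
for key, tensor in [("scale", model.scale), ("offset", model.offset)]:
    entry = onnx_model.metadata_props.add()
    entry.key = key
    entry.value = json.dumps(tensor.tolist())
onnx.save(onnx_model, "mlp.onnx")
```

On the C++ side, ONNX Runtime exposes the custom metadata map through `Ort::Session::GetModelMetadata()`. A rough sketch of how I imagine reading it back (assuming ONNX Runtime ≥ 1.11, where the `*Allocated` helpers return smart pointers; the JSON parsing and the output post-processing are left as comments):

```cpp
#include <onnxruntime_cxx_api.h>
#include <iostream>

int main() {
    Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "mlp");
    Ort::SessionOptions options;
    Ort::Session session(env, "mlp.onnx", options);

    Ort::AllocatorWithDefaultOptions allocator;
    Ort::ModelMetadata metadata = session.GetModelMetadata();

    // Look up the user-defined entries stored in metadata_props.
    Ort::AllocatedStringPtr scale_json =
        metadata.LookupCustomMetadataMapAllocated("scale", allocator);
    Ort::AllocatedStringPtr offset_json =
        metadata.LookupCustomMetadataMapAllocated("offset", allocator);

    if (scale_json && offset_json) {
        // Parse the JSON strings (e.g. with nlohmann::json) and apply
        // y = scale * output + offset after running the session.
        std::cout << "scale:  " << scale_json.get() << "\n"
                  << "offset: " << offset_json.get() << "\n";
    }
    return 0;
}
```

Is this the intended way to attach such parameters, or is there a better mechanism (e.g. baking the scaling into the graph itself)?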