I followed this tutorial step by step and it worked very well, so I can use DirectML for inference: https://learn.microsoft.com/en-au/windows/ai/models/get-started-onnx-winui
But when I created a new console program and imitated the same process, I got an error when loading the model.
Then I tried the ONNX model from the original tutorial, and it loads normally; if I use the CPU for inference instead, my own model also loads normally, so I suspected the problem was with my model. What is puzzling is that when I put the model that produced the error into the WinUI project I built from the tutorial, it also loads normally.
I use the latest and the same version of all packages, so why does this problem occur?
You could try setting the log severity to VERBOSE in the SessionOptions prior to creating the inference session. Maybe that will provide a hint.
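In the C# API, that suggestion looks roughly like the sketch below. The model path and the DirectML device index are placeholders, and `AppendExecutionProvider_DML` assumes the Microsoft.ML.OnnxRuntime.DirectML package is referenced:

```csharp
using Microsoft.ML.OnnxRuntime;

// Sketch: enable verbose ONNX Runtime logging before the session is created,
// so the initialization failure is reported with more detail.
var options = new SessionOptions
{
    LogSeverityLevel = OrtLoggingLevel.ORT_LOGGING_LEVEL_VERBOSE,
};

// Requires the DirectML package; 0 selects the default GPU device.
options.AppendExecutionProvider_DML(0);

// "model.onnx" is a placeholder path.
using var session = new InferenceSession("model.onnx", options);
```

The verbose log is written to the console (the default logger), so run the console app from a terminal to see the additional detail during `InferenceSession` construction.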
Thanks for your help, I have solved my problem. I noticed the platform I was debugging with was "Any CPU"; I changed it to x64 and it worked. Although I had copied the settings from "Any CPU" when I created the x64 configuration, the change was still effective.
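For reference, the same fix can be pinned in the project file instead of the solution's platform dropdown; this is a minimal sketch using the standard MSBuild property (the DirectML native binary is x64, so an "Any CPU" process running as x86 cannot load it):

```xml
<PropertyGroup>
  <!-- Force the console app to run as a 64-bit process so the
       DirectML/onnxruntime native DLLs can be loaded. -->
  <PlatformTarget>x64</PlatformTarget>
</PropertyGroup>
```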
A pleasant discussion
reference information
Urgency
No response
Target platform
windows x64
Build script
no
Error / output
Microsoft.ML.OnnxRuntime.OnnxRuntimeException
  HResult=0x80131500
  Message=[ErrorCode:RuntimeException] Exception during initialization:
  Source=Microsoft.ML.OnnxRuntime
  StackTrace:
   at Microsoft.ML.OnnxRuntime.NativeApiStatus.VerifySuccess(IntPtr nativeStatus)
   at Microsoft.ML.OnnxRuntime.InferenceSession.Init(String modelPath, SessionOptions options, PrePackedWeightsContainer prepackedWeightsContainer)
   at Microsoft.ML.OnnxRuntime.InferenceSession..ctor(String modelPath, SessionOptions options)
   at YoloParser.YoloV8Parser.InitModel() in D:\Cznorth\Documents\code\net\dml_test\dml_test\YoloParser\YoloV8Parser.cs: line 50
   at YoloParser.YoloV8Parser.Parse(Image`1 image) in D:\Cznorth\Documents\code\net\dml_test\dml_test\YoloParser\YoloV8Parser.cs: line 73
   at Program.<Main>$(String[] args) in D:\Cznorth\Documents\code\net\dml_test\dml_test\Program.cs: line 14
Visual Studio Version
No response
GCC / Compiler Version
No response