
Initialization crash using different onnx model and different project #22961

Closed
ChenHackerGit opened this issue Nov 27, 2024 · 2 comments
Labels
build build issues; typically submitted using template ep:DML issues related to the DirectML execution provider .NET Pull requests that update .net code

Comments

@ChenHackerGit

Describe the issue

I followed this tutorial step by step and it worked very well, so I can use DirectML for inference: https://learn.microsoft.com/en-au/windows/ai/models/get-started-onnx-winui. But then I created a new console program that imitates the same process, and there was an error when loading the model.

  HResult=0x80131500
  Message=[ErrorCode:RuntimeException] Exception during initialization: 
  Source=Microsoft.ML.OnnxRuntime
  StackTrace:
   at Microsoft.ML.OnnxRuntime.NativeApiStatus.VerifySuccess(IntPtr nativeStatus)
   at Microsoft.ML.OnnxRuntime.InferenceSession.Init(String modelPath, SessionOptions options, PrePackedWeightsContainer prepackedWeightsContainer)
   at Microsoft.ML.OnnxRuntime.InferenceSession..ctor(String modelPath, SessionOptions options)
   at YoloParser.YoloV8Parser.InitModel() in D:\Cznorth\Documents\code\net\dml_test\dml_test\YoloParser\YoloV8Parser.cs:line 50
   at YoloParser.YoloV8Parser.Parse(Image`1 image) in D:\Cznorth\Documents\code\net\dml_test\dml_test\YoloParser\YoloV8Parser.cs:line 73
   at Program.<Main>$(String[] args) in D:\Cznorth\Documents\code\net\dml_test\dml_test\Program.cs:line 14

Then I tried the ONNX model from the original tutorial, and it loads normally; using the CPU for inference also works. So I suspected the problem was with my model. What is puzzling is that when I put the model that previously caused the error into the WinUI project imitating the tutorial, it also loads normally there.

I use the latest and the same version of all packages in both projects, so why does this problem occur?

Reference information

<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net8.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.ML.OnnxRuntime.DirectML" Version="1.20.1" />
    <PackageReference Include="SharpDX.DXGI" Version="4.2.0" />
    <PackageReference Include="SixLabors.ImageSharp" Version="3.1.6" />
  </ItemGroup>

  <ItemGroup>
    <Folder Include="model\" />
    <Folder Include="assets\" />
  </ItemGroup>

  <ItemGroup>
    <None Update="assets\1.jpg">
      <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
    </None>
    <None Update="model\yolov8.onnx">
      <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
    </None>
    <None Update="model\resnet50-v2-7.onnx">
      <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
    </None>
  </ItemGroup>

</Project>

Urgency

No response

Target platform

windows x64

Build script

no

Error / output

Microsoft.ML.OnnxRuntime.OnnxRuntimeException
HResult=0x80131500
Message=[ErrorCode:RuntimeException] Exception during initialization:
Source=Microsoft.ML.OnnxRuntime
StackTrace:
at Microsoft.ML.OnnxRuntime.NativeApiStatus.VerifySuccess(IntPtr nativeStatus)
at Microsoft.ML.OnnxRuntime.InferenceSession.Init(String modelPath, SessionOptions options, PrePackedWeightsContainer prepackedWeightsContainer)
at Microsoft.ML.OnnxRuntime.InferenceSession..ctor(String modelPath, SessionOptions options)
at YoloParser.YoloV8Parser.InitModel() in D:\Cznorth\Documents\code\net\dml_test\dml_test\YoloParser\YoloV8Parser.cs:line 50
at YoloParser.YoloV8Parser.Parse(Image`1 image) in D:\Cznorth\Documents\code\net\dml_test\dml_test\YoloParser\YoloV8Parser.cs:line 73
at Program.<Main>$(String[] args) in D:\Cznorth\Documents\code\net\dml_test\dml_test\Program.cs:line 14

Visual Studio Version

No response

GCC / Compiler Version

No response

@ChenHackerGit ChenHackerGit added the build build issues; typically submitted using template label Nov 27, 2024
@github-actions github-actions bot added .NET Pull requests that update .net code ep:DML issues related to the DirectML execution provider labels Nov 27, 2024
@skottmckay
Contributor

You could try setting the log severity to VERBOSE in the SessionOptions prior to creating the inference session. Maybe that will provide a hint.
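For reference, enabling verbose logging looks roughly like this (a minimal sketch, not the reporter's actual code; the device index 0 and the model path from the project file above are assumptions):

```csharp
using Microsoft.ML.OnnxRuntime;

// Sketch: raise the log severity to VERBOSE before creating the session,
// so ONNX Runtime reports details such as graph partitioning and which
// execution provider each node is assigned to.
using var options = new SessionOptions();
options.LogSeverityLevel = OrtLoggingLevel.ORT_LOGGING_LEVEL_VERBOSE;

// Register the DirectML execution provider (device 0 is an assumption).
options.AppendExecutionProvider_DML(0);

// Path taken from the csproj above; adjust to your layout.
using var session = new InferenceSession(@"model\yolov8.onnx", options);
```

With VERBOSE logging, initialization failures usually print the failing native call or operator before the exception is thrown.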

@ChenHackerGit
Author

> You could try setting the log severity to VERBOSE in the SessionOptions prior to creating the inference session. Maybe that will provide a hint.

Thanks for your help, I have solved my problem now. I noticed the platform I was debugging with was "Any CPU". I changed it to x64, and it worked. Although I copied the settings from "Any CPU" when I created the x64 configuration, the change was still effective.
A pleasant discussion!
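For anyone hitting the same mismatch: the native onnxruntime.dll shipped by Microsoft.ML.OnnxRuntime.DirectML is architecture-specific, so the console project can also be pinned to a 64-bit process directly in the csproj instead of (or in addition to) the IDE's platform dropdown. A sketch against the project file above; property placement may vary:

```xml
<PropertyGroup>
  <!-- Force a 64-bit process so the x64 native onnxruntime.dll loads -->
  <PlatformTarget>x64</PlatformTarget>
</PropertyGroup>
```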
