mT5 model loading fails #57
Comments
Are you using it for GPU?
Yes, but I also tried with CPU. It still fails.
Sorry, the library does not support GPU yet. On CPU, the issue looks similar to microsoft/onnxruntime#3113. Are you facing the same issue there?
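To rule out a provider problem, a minimal sketch of checking which execution providers the installed onnxruntime build exposes, and of pinning the session explicitly to CPU, could look like this (the decoder path is a placeholder, not a file from this issue):

import onnxruntime as ort

# List the execution providers this onnxruntime build was compiled with.
print(ort.get_available_providers())

# Pin the session to CPU explicitly; "mt5-decoder.onnx" is a placeholder path.
sess = ort.InferenceSession("mt5-decoder.onnx", providers=["CPUExecutionProvider"])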
Yes, on CPU I hit the same issue.
It looks like the issue is in onnxruntime itself; I suggest you create an issue there.
I found some more logs:
Could you please provide reproducible code and the full error log?
I am not sure about the code, but you can use the following model; I am using the example mentioned on the main README page.
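For reference, the README-style usage being described is roughly the sketch below, assuming fastT5's export_and_get_onnx_model entry point; "google/mt5-small" stands in for whichever mT5 checkpoint is actually used:

from fastT5 import export_and_get_onnx_model
from transformers import AutoTokenizer

model_name = "google/mt5-small"  # placeholder for the reporter's mT5 checkpoint
model = export_and_get_onnx_model(model_name)  # exports encoder/decoder to ONNX and loads them

tokenizer = AutoTokenizer.from_pretrained(model_name)
tokens = tokenizer("translate English to German: Hello", return_tensors="pt")
out = model.generate(input_ids=tokens["input_ids"],
                     attention_mask=tokens["attention_mask"],
                     num_beams=2)
print(tokenizer.decode(out.squeeze(), skip_special_tokens=True))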
Also, I was able to find a solution for this via the ONNX GitHub; there is a new tool for converting mT5 models.
Now I am facing an input name issue:
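When two export paths disagree on input names, one way to see what a given ONNX file actually expects is to list its graph inputs and outputs; the file name below is a placeholder:

from onnxruntime import InferenceSession

# "mt5-decoder.onnx" is a placeholder for the exported decoder file.
sess = InferenceSession("mt5-decoder.onnx", providers=["CPUExecutionProvider"])
print("inputs: ", [(i.name, i.shape) for i in sess.get_inputs()])
print("outputs:", [(o.name, o.shape) for o in sess.get_outputs()])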
I tried a different approach; now it gives:
Hello, I have an mT5 pretrained model and I am using the fastT5 approach to convert it to ONNX. The conversion of the model works fine, but creating the decoder_sess fails at

decoder_sess = InferenceSession(str(path_to_decoder))

More specifically, it fails without any error message, as

Process finished with exit code 135 (interrupted by signal 7: SIGEMT)

Loading the encoder model works, but not the decoder model.
I am using the latest version of fastt5==0.1.4.
Any ideas on how to create the session?
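A minimal reproduction of the failing step, assuming the exported files sit in a local "models" directory (the file names below are placeholders for whatever the fastT5 export produced):

from pathlib import Path
from onnxruntime import InferenceSession

model_dir = Path("models")  # placeholder export directory
path_to_encoder = model_dir / "mt5-encoder.onnx"  # placeholder file name
path_to_decoder = model_dir / "mt5-decoder.onnx"  # placeholder file name

encoder_sess = InferenceSession(str(path_to_encoder))  # reported to load fine
decoder_sess = InferenceSession(str(path_to_decoder))  # crashes here: exit code 135 (SIGEMT)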