Hello!
I was trying to compile a custom PyTorch Transformer model with the DeepSparse Engine, but it kept crashing during model compilation. I hit the same problem with the vanilla Transformer Encoder from PyTorch. Are Transformer-only models unsupported? FYI, the same steps worked for a pretrained AlexNet from PyTorch. My steps are shown below:
Using:
- DeepSparse 1.2.0
- Torch 1.12.1+cu113
- Python 3.7.15
Instantiate the model:
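A minimal sketch of this step, assuming a stock `nn.TransformerEncoder` (the `d_model`, `nhead`, and `num_layers` values here are illustrative, not the exact ones from my notebook):

```python
import torch
import torch.nn as nn

# Vanilla Transformer encoder from PyTorch; hyperparameters are illustrative.
encoder_layer = nn.TransformerEncoderLayer(d_model=512, nhead=8, batch_first=True)
model = nn.TransformerEncoder(encoder_layer, num_layers=6)
model.eval()  # inference mode for export
```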
Convert to ONNX:
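Roughly along these lines, assuming a fixed dummy input of shape `(batch, seq_len, d_model)` and an illustrative output file name:

```python
# Dummy input shaped (batch, seq_len, d_model), matching batch_first=True above.
dummy_input = torch.randn(1, 32, 512)

torch.onnx.export(
    model,
    dummy_input,
    "transformer_encoder.onnx",
    input_names=["input"],
    output_names=["output"],
    opset_version=13,
)
```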
Compile and run on DeepSparse (notebook crashes at `compile_model`):
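A sketch of this step using the DeepSparse 1.x `compile_model` API (the file name matches the export sketch above):

```python
import numpy as np
from deepsparse import compile_model

batch_size = 1

# The notebook crashes inside this call.
engine = compile_model("transformer_encoder.onnx", batch_size=batch_size)

# If compilation succeeded, inference would look like this:
inputs = [np.random.randn(batch_size, 32, 512).astype(np.float32)]
outputs = engine.run(inputs)
```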
The converted ONNX graph was inspected with `print(onnx.helper.printable_graph(onnx_model.graph))`.
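Here `onnx_model` is presumably loaded from the exported file, e.g.:

```python
import onnx

# Load the exported model and print a human-readable view of its graph.
onnx_model = onnx.load("transformer_encoder.onnx")
print(onnx.helper.printable_graph(onnx_model.graph))
```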