Triton Inference Server #149
Unanswered
B-M-S-West asked this question in Q&A:

Anyone tried to use the Python backend and deploy this on the Triton Inference Server?
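For context, a Triton Python-backend deployment boils down to a `model.py` that exposes a `TritonPythonModel` class. Below is a minimal sketch, not a tested deployment of this project: the model name, the `INPUT0`/`OUTPUT0` tensor names, and the placeholder compute are assumptions, and the real model load and forward pass would replace them.

```python
# models/my_python_model/1/model.py  (hypothetical model name and repository layout)
import json

import numpy as np
import triton_python_backend_utils as pb_utils


class TritonPythonModel:
    """Skeleton of a Triton Python-backend model."""

    def initialize(self, args):
        # args["model_config"] holds the config.pbtxt contents as a JSON string.
        self.model_config = json.loads(args["model_config"])
        # Load the actual model here (e.g. an ONNX Runtime session or a PyTorch module).

    def execute(self, requests):
        responses = []
        for request in requests:
            # Tensor names must match the input/output sections of config.pbtxt.
            input0 = pb_utils.get_input_tensor_by_name(request, "INPUT0").as_numpy()

            # Placeholder compute: replace with the real forward pass.
            output0 = input0.astype(np.float32)

            out_tensor = pb_utils.Tensor("OUTPUT0", output0)
            responses.append(pb_utils.InferenceResponse(output_tensors=[out_tensor]))
        return responses

    def finalize(self):
        # Optional cleanup hook, called when the model is unloaded.
        pass
```

In the standard layout this file sits at `models/<model_name>/1/model.py` next to a `config.pbtxt` that declares `backend: "python"` and the matching input/output tensors, and the server is pointed at the repository with `tritonserver --model-repository=<path>`.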
Replies: 3 comments
- Hi, yes - join the TensorRT-LLM channel on Discord for more info: https://discord.gg/xUEggt5xqn
-
Hey @geraldstanje has there been any update on this? |
Beta Was this translation helpful? Give feedback.
0 replies
- Just bumping this. Any more info? I run into this error converting from ONNX, regardless of whether the dimensions are fixed or dynamic. Had to lowercase the equation in the final einsum to get there. Any help would be appreciated! Love the project.
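To make the einsum workaround in the last comment concrete: lowercasing the equation only relabels the subscripts, so the computation itself is unchanged. A minimal sketch with made-up shapes and labels (not the equation from the actual model):

```python
import torch

q = torch.randn(2, 4, 8)  # hypothetical shapes
k = torch.randn(2, 6, 8)

# The two equations are equivalent; only the subscript labels differ.
# The uppercase form is the kind the commenter had to rewrite to get the ONNX conversion through.
out_upper = torch.einsum("BHD,BKD->BHK", q, k)
out_lower = torch.einsum("bhd,bkd->bhk", q, k)

assert torch.allclose(out_upper, out_lower)
```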