Can we achieve data parallelism and model parallelism in onnxruntime? #21928
deng-ShiFu started this conversation in Ideas / Feature Requests
Replies: 0 comments
Hi,
We want to implement parallel inference on multiple FPGAs, using data parallelism or model parallelism. Does onnxruntime support this kind of functionality?
If it is not supported, could we achieve this by extending onnxruntime? If you have any relevant ideas, could you briefly describe them so we have a starting point? Thank you.
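For context, one common pattern for data parallelism on top of onnxruntime is to create one `InferenceSession` per device and shard each input batch across them. The sketch below is only an illustration of that pattern under assumptions: `run_on_device` is a hypothetical stand-in for a real `session.run(...)` call, since FPGA execution providers are vendor-specific and not part of stock onnxruntime.

```python
# Hedged sketch of batch-level data parallelism: split a batch into
# contiguous shards, run each shard on its own "device", and gather
# the outputs in order. In a real setup, each device would own an
# onnxruntime.InferenceSession configured with that device's
# execution provider.
from concurrent.futures import ThreadPoolExecutor


def run_on_device(device_id, shard):
    # Hypothetical placeholder for:
    #   sessions[device_id].run(None, {"input": shard})
    # Here we just double each element as dummy "inference".
    return [x * 2 for x in shard]


def data_parallel_infer(batch, num_devices):
    # Split the batch into contiguous chunks, one per device
    # (ceiling division so no element is dropped).
    chunk = -(-len(batch) // num_devices)
    shards = [batch[i * chunk:(i + 1) * chunk] for i in range(num_devices)]
    with ThreadPoolExecutor(max_workers=num_devices) as pool:
        results = pool.map(run_on_device, range(num_devices), shards)
    # Flatten shard outputs back into one batch, preserving order.
    return [y for shard_out in results for y in shard_out]


print(data_parallel_infer([1, 2, 3, 4, 5], 2))
```

Model parallelism is harder to express this way: it typically requires partitioning the ONNX graph itself (e.g. splitting it into sub-models and pipelining activations between devices), which onnxruntime does not do automatically.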