Support For TPU/XLA Devices #5635
It's a good idea. Right now we have to do a lot of manual work: install Intel oneAPI, install Visual Studio on Windows, set the correct PyTorch version... Exhausting. I'm also not sure whether it currently uses the shared GPU; I have both an Intel UHD and an Nvidia GPU.
I'm experimenting to see if I can add support for TPU/XLA devices in the comfy code myself. If it works out, I can try to open a PR for it. I don't know whether there are configurations that mix TPUs with GPUs. As for sharding across TPU cores, the most common approaches I've seen are SPMD and FSDPv2. I believe you can also do it with XMP (xla_multiprocessing), but I haven't had much luck with that on TPUv2-8 or TPUv3-8. Torch XLA: https://github.com/pytorch/xla
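As a rough illustration of the device-selection side of this, here is a minimal, hypothetical sketch (not ComfyUI's actual code) of how a backend picker might prefer an XLA/TPU device when `torch_xla` is installed and otherwise fall back to CUDA or CPU. The function name `pick_backend` and the probing via `importlib` are my own assumptions for the example:

```python
import importlib.util


def pick_backend() -> str:
    """Return which accelerator backend would be used.

    Hypothetical helper: prefers torch_xla (TPU/XLA) if it is
    installed, then CUDA via plain PyTorch, else CPU.
    """
    # torch_xla present -> an XLA device (e.g. a TPU core) is available
    if importlib.util.find_spec("torch_xla") is not None:
        return "xla"
    # Fall back to stock PyTorch's CUDA check if torch is installed
    if importlib.util.find_spec("torch") is not None:
        import torch
        return "cuda" if torch.cuda.is_available() else "cpu"
    return "cpu"
```

In real torch_xla code the "xla" branch would typically call `torch_xla.core.xla_model.xla_device()` to get the actual device object; the string return here is only to keep the sketch self-contained.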
hi @gabriel-montrose, I have just created a PR: #5657
If anyone needs TPU/XLA device support for ComfyUI, please see my fork, ComfyUI-TPU. I'll maintain it as long as there's interest.
Feature Idea
Support For TPU/XLA Devices
Existing Solutions
No response
Other
No response