nanodet model does not have CPU support for inference #907
Replies: 2 comments 3 replies
-
I'm not entirely sure, but I think inference should also work on the CPU. The issue you linked only says that training can't be done on the CPU, but in the notebook you want inference, so I think it should work. The repository also includes an OpenVINO section, which also has the option to export to ONNX, so maybe look into that.
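As a rough illustration of that route, here is a minimal sketch of running an ONNX export on the CPU with ONNX Runtime. It is not the repository's actual export or inference code; the file name `nanodet.onnx`, the 416x416 input size, and the random dummy input are placeholders I'm assuming.

```python
# Minimal sketch: CPU inference with an ONNX export of the model (assumed file name).
import numpy as np
import onnxruntime as ort

# Force the CPU execution provider so no GPU is required.
session = ort.InferenceSession("nanodet.onnx", providers=["CPUExecutionProvider"])

# Dummy input just to illustrate the call; replace with a real preprocessed image
# of whatever shape the exported model expects.
input_name = session.get_inputs()[0].name
dummy_input = np.random.rand(1, 3, 416, 416).astype(np.float32)

outputs = session.run(None, {input_name: dummy_input})
print([o.shape for o in outputs])
```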
-
To be precise, if we look here, we have `.cuda()` calls. I need to remove these lines so this works properly.
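For reference, the usual device-agnostic PyTorch pattern that replaces hard-coded `.cuda()` calls looks roughly like the sketch below. The placeholder model and input shape are assumptions for illustration, not the actual NanoDet code.

```python
# Minimal sketch, assuming a standard PyTorch model object: pick the device at
# runtime and move the model/tensors with .to(device) instead of .cuda(),
# so the same code runs on CPU when no GPU is available.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = torch.nn.Conv2d(3, 16, 3)   # placeholder for the actual NanoDet model
model = model.to(device)
model.eval()

dummy_input = torch.rand(1, 3, 416, 416, device=device)
with torch.no_grad():
    output = model(dummy_input)
print(output.shape, output.device)
```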
-
For GSoC 2023, I have chosen the Nanodet model, which currently supports predictions only on GPU devices and not on CPU. I have identified an issue related to this limitation, which is indicated in the code through the use of `.cuda()`.