Object detection API #132
There is nothing preventing you from integrating and experimenting with pruning in the object detection API. The object detection project has internally been mostly migrated to work with Keras (e.g. this sample usage). That said, there are probably ways we can make the integration easier. Someone is experimenting with applying pruning to the object detection API; I'll bring him into this thread.
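For reference, the Keras pruning flow in tensorflow_model_optimization looks roughly like the sketch below. The MobileNetV2 backbone, input size, and schedule values are placeholders of mine, not anything taken from the OD API:

```python
import tensorflow as tf
import tensorflow_model_optimization as tfmot

prune_low_magnitude = tfmot.sparsity.keras.prune_low_magnitude

# Placeholder Keras backbone; the OD API's Keras feature extractors are
# built as tf.keras models in a similar way.
base_model = tf.keras.applications.MobileNetV2(
    input_shape=(320, 320, 3), include_top=False, weights=None)

# Ramp sparsity from 0% to 80% between steps 2000 and 10000.
pruning_params = {
    'pruning_schedule': tfmot.sparsity.keras.PolynomialDecay(
        initial_sparsity=0.0,
        final_sparsity=0.8,
        begin_step=2000,
        end_step=10000),
}

# Wrap the model so its prunable layers carry pruning masks.
pruned_model = prune_low_magnitude(base_model, **pruning_params)

# UpdatePruningStep must run every training step so the masks are updated;
# with a custom training loop you invoke its batch hooks yourself instead
# of passing it to model.fit.
callbacks = [tfmot.sparsity.keras.UpdatePruningStep()]
```

The main open question for the OD API is wiring the `UpdatePruningStep` logic into its own training loop rather than `model.fit`.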
Hi, I'm experimenting with pruning for object detection. I'll post more information on this thread when I make some progress. Thank you for your attention.
Thanks for your answer.
@alanchiao any updates?
@Xhark has been continuing to run the experiments. We'll give an update when we see clear good or bad signs. He has been working on other efforts at the same time (e.g. #133). If getting object detection models with smaller storage space usage is a need of yours (see #173 for considerations), please do feel free to give it a try. There are ongoing efforts (#155, #133) that will make this easier once they are finished.
I ran into this same problem a while ago. I eventually wrote a quick patch to the TF Object Detection API, and I pushed an example to GitHub here: https://github.com/panchgonzalez/tf_object_detection_pruning
@panchgonzalez your patch is not complete. It's missing changes in some of the files.
It also looks like one of the variables is causing a runtime error.
Hi @anshkumar, thanks for pointing this out. It looks like I didn't include the changes to some of the files. Re: the runtime error you mentioned above: it looks like some changes in TF 1.15 are causing this error, but I haven't quite figured out what exactly. The error goes away when downgrading to TF 1.14. I did a quick test after these changes and was able to successfully prune a MaskRCNN-InceptionV2 model.
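In case it helps anyone verify that a run actually pruned anything: one quick diagnostic is to read the trained checkpoint back and count the zeros in each weight tensor. This is only a sketch of my own, not part of the patch above; the `mask`/`threshold`/`Momentum` name filters are assumptions based on the usual variable naming of the pruning wrappers and the TF1 optimizer slots:

```python
import numpy as np
import tensorflow as tf

def report_checkpoint_sparsity(checkpoint_path):
    """Print the fraction of exactly-zero values in each weight tensor."""
    reader = tf.train.load_checkpoint(checkpoint_path)
    for name, _ in tf.train.list_variables(checkpoint_path):
        # Skip the pruning masks/thresholds themselves and optimizer slots.
        if 'mask' in name or 'threshold' in name or 'Momentum' in name:
            continue
        values = reader.get_tensor(name)
        if values.ndim < 2:  # skip biases and scalars
            continue
        sparsity = 1.0 - np.count_nonzero(values) / values.size
        print('%-80s sparsity=%.2f' % (name, sparsity))

# Example (hypothetical path):
# report_checkpoint_sparsity('/tmp/mask_rcnn_pruned/model.ckpt-100000')
```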
@panchgonzalez I somehow got it working. One major thing was downgrading numpy to 1.16. Also, did you find any improvement in speed or in model size after pruning?
EDIT: It's mentioned here that:
In my case I'm not finding any improvement in speed. I'm using an NVIDIA V100 for inference. Is there any special package required to run inference on a sparse model?
@anshkumar, glad to see you got it to work. Re: speed improvements -- that's expected: pruning only zeroes out weights, and the tensors are still stored and executed as dense, so a standard runtime won't get any faster on its own. Optimized inference engines (e.g., NVIDIA's TensorRT framework) might be your best bet.
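To add one data point on the size side: with the Keras API, the on-disk savings only show up after stripping the pruning wrappers and compressing the saved file, since the zeroed weights are still stored densely. A minimal sketch, assuming a model that was actually trained with the `prune_low_magnitude` wrappers (the MobileNetV2 here is only a stand-in):

```python
import os
import tempfile
import zipfile

import tensorflow as tf
import tensorflow_model_optimization as tfmot

def compressed_size(model):
    """Save the model to .h5 and return its size after DEFLATE compression."""
    _, keras_file = tempfile.mkstemp('.h5')
    tf.keras.models.save_model(model, keras_file, include_optimizer=False)
    _, zipped_file = tempfile.mkstemp('.zip')
    with zipfile.ZipFile(zipped_file, 'w', compression=zipfile.ZIP_DEFLATED) as f:
        f.write(keras_file)
    return os.path.getsize(zipped_file)

# Stand-in for a model that was actually trained with the pruning wrappers.
pruned_model = tfmot.sparsity.keras.prune_low_magnitude(
    tf.keras.applications.MobileNetV2(weights=None))

# strip_pruning removes the wrappers and masks; after training, the zeros
# remain in the kernels, so the savings appear once the file is compressed.
final_model = tfmot.sparsity.keras.strip_pruning(pruned_model)
print('Compressed size: %d bytes' % compressed_size(final_model))
```

Latency is a separate question: the stripped model still runs as dense math, which is consistent with seeing no speedup on the V100.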
@panchgonzalez I thought so. Anyway, thanks for the help.
@Xhark can you give an update?
Hello all,
I was wondering if your pruning tools could be used on my object detection models (SSD, Faster RCNN, ...) trained with the TensorFlow object detection API.
When training, I don't use Keras directly; I follow the tutorials of the object detection API.
Thanks for your help :)