
Using protobin and caffemodel for inference gives the wrong result #61

Open
peterpaniff opened this issue Dec 12, 2018 · 1 comment

Comments

@peterpaniff

First I used the prototxt and caffemodel for inference and got the right result, but the compute cost was too high on my cellphone. When I switched to the protobin and caffemodel for inference, the result was wrong: the confidence is always 1. Any ideas?
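For reference, a text-format prototxt can usually be converted to a binary protobin with protoc's `--encode` mode. The message type `caffe.NetParameter` and the file names below are assumptions for illustration, not taken from this thread:

```shell
# Convert a text-format Caffe net definition to binary protobuf.
# Assumes caffe.proto is in the current directory and the net uses the
# standard caffe.NetParameter message type (both are assumptions here).
protoc --encode=caffe.NetParameter caffe.proto < deploy.prototxt > deploy.protobin
```

As a sanity check, decoding the binary back to text (`protoc --decode=caffe.NetParameter caffe.proto < deploy.protobin`) should reproduce the original definition; if it does, a constant confidence of 1 more likely points at a preprocessing or postprocessing mismatch than at the conversion step itself.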

@solrex
Owner

solrex commented Dec 13, 2018

@peterpaniff The latest version does not support prototxt; how were you able to use it for inference?
