Is CHW format mandatory in inference? #10

Open
aidamian opened this issue Feb 7, 2018 · 2 comments

Comments

aidamian commented Feb 7, 2018

So, the model was loaded/trained with Keras/TensorFlow, and its input is certainly HWC. However, during inference I see that both the testing and conversion scripts force CHW input. Why is that, and how is that possible?
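
(Editor's note: for context, going from the HWC layout a Keras/TensorFlow pipeline produces to the CHW layout the scripts expect is just a transpose of the channel axis. A minimal sketch, assuming a NumPy image array; the shapes and the random data are illustrative only:)

```python
import numpy as np

# Illustrative HWC image, e.g. shape (224, 224, 3): height, width, channels.
img_hwc = np.random.rand(224, 224, 3).astype(np.float32)

# Move the channel axis to the front: (H, W, C) -> (C, H, W).
img_chw = np.transpose(img_hwc, (2, 0, 1))

print(img_hwc.shape)  # (224, 224, 3)
print(img_chw.shape)  # (3, 224, 224)
```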

faustomilletari (Contributor) commented Feb 7, 2018 via email

aidamian (Author) commented Feb 7, 2018

I understand. Does this mean that when the input tensor is 4D, the converter automatically assumes dim 2 and dim 4 are swapped (I mean specifically the case where the input is a TensorFlow frozen graph)?

Also, why do they say at
https://devtalk.nvidia.com/default/topic/1025594/how-to-feed-a-3-channel-image-to-tensorrt/
that you can feed whatever you have?

Thanks
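
(Editor's note: as a side note on the layout question, one way to see which format a Keras model was actually built with is to inspect the data_format attribute of its convolutional and pooling layers. A minimal sketch, assuming a hypothetical model.h5 file:)

```python
from tensorflow import keras

# "model.h5" is a hypothetical path to the trained Keras model.
model = keras.models.load_model("model.h5")

# Conv/pooling layers report the layout they were built with:
# 'channels_last' corresponds to HWC, 'channels_first' to CHW.
for layer in model.layers:
    if hasattr(layer, "data_format"):
        print(layer.name, layer.data_format)
```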
