Function definition and function implementation do not match #4
Comments
I am curious too, since the original paper proposed the micro-block as BN-Act-Conv2d. This ordering is also replicated by other PyTorch implementations, and is used in DenseNet as well.
When BN-Act comes first, batch normalization and an activation are applied directly to the input images; that is at least one reason the helper function might be written this way. Once the blocks are tiled, the layers look similar for most of the network either way. There are lots of interesting ways to style residual blocks; we may as well design networks the way NAS does.
Hi, thanks Relh.
At line 246, the function is named _bn_relu_conv(), but it is implemented as conv --> batch_norm --> relu.
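For reference, a minimal PyTorch sketch of the two orderings being discussed. The function names and signatures here are illustrative, not the repository's actual code: the first helper matches the pre-activation ordering the name `_bn_relu_conv` promises (as in the pre-activation ResNet and DenseNet papers), while the second is the post-activation ordering the issue reports finding in the implementation.

```python
import torch
import torch.nn as nn

def bn_relu_conv(in_ch, out_ch, kernel_size=3, stride=1):
    """Pre-activation micro-block: BN -> ReLU -> Conv (what the name implies)."""
    return nn.Sequential(
        nn.BatchNorm2d(in_ch),        # normalize the incoming features first
        nn.ReLU(inplace=True),        # then activate
        nn.Conv2d(in_ch, out_ch, kernel_size,
                  stride=stride, padding=kernel_size // 2, bias=False),
    )

def conv_bn_relu(in_ch, out_ch, kernel_size=3, stride=1):
    """Post-activation ordering: Conv -> BN -> ReLU (what the issue reports)."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size,
                  stride=stride, padding=kernel_size // 2, bias=False),
        nn.BatchNorm2d(out_ch),       # BN now normalizes the conv output
        nn.ReLU(inplace=True),
    )

x = torch.randn(2, 16, 8, 8)
print(bn_relu_conv(16, 32)(x).shape)   # torch.Size([2, 32, 8, 8])
print(conv_bn_relu(16, 32)(x).shape)   # torch.Size([2, 32, 8, 8])
```

Both variants produce the same output shape, so the discrepancy only shows up in where normalization statistics are computed, not in tensor dimensions.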