
Padding Layer #15

Status: Open. srk97 wants to merge 8 commits into master.
Conversation

srk97 commented on Jul 26, 2018

This layer introduces arbitrary padding as a separate layer instead of building it into convolution. The idea is to allow only fixed padding types, such as VALID, SAME, and FULL, for the convolution layer. Since arbitrary padding is still required in some architectures, it has been created as a separate layer. The layer takes four arguments:

  • Left Pad
  • Right Pad
  • Top Pad
  • Bottom Pad

The expected format for the layout string is: PADDING2D|topPad|bottomPad|leftPad|rightPad
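For example, a token such as the following (pad values hypothetical) would pad each input matrix with one row of zeros at the top and bottom, and two columns of zeros on the left and right:

PADDING2D|1|1|2|2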

This contains the naive implementation of padding. More efficient approaches, such as pre-allocation or a single data structure for the forward and backward propagations, are yet to be discussed.

TO-DO

  • Tests for padding

srk97 requested a review from lmoneta as a code owner on July 26, 2018.
rightPad = strRightPad.Atoi();
} break;
}
++idxToken;

Review comment:

What is a valid string here? Add some information in the comments ("1, 2, 3, 4"?). Any caveats the caller should be aware of?

srk97 (author) replied:

Sure. I don't have any checks for negative numbers in the padding string; is the check necessary? I'll add the order of the padding values to the comments nevertheless.
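A minimal sketch of such a check, reusing the Log() << kFATAL idiom seen elsewhere in MethodDL (variable names hypothetical, not part of this PR):

// TString::Atoi() happily returns negative values, so validate each
// parsed pad before handing it to the layer.
int rightPad = strRightPad.Atoi();
if (rightPad < 0) {
   Log() << kFATAL << "Invalid padding string: pad values must be non-negative." << Endl;
}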

@@ -496,6 +496,8 @@ void MethodDL::CreateDeepNet(DNN::TDeepNet<Architecture_t, Layer_t> &deepNet,
} else if (strLayerType == "LSTM") {
Log() << kFATAL << "LSTM Layer is not yet fully implemented" << Endl;
//ParseLstmLayer(deepNet, nets, layerString->GetString(), subDelimiter);
} else if (strLayerType == "PADDING") {
ParseRnnLayer(deepNet, nets, layerString->GetString(), subDelimiter);

Review comment:

Typo, should be ParsePaddingLayer.
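With that fix, the branch would presumably read:

} else if (strLayerType == "PADDING") {
   ParsePaddingLayer(deepNet, nets, layerString->GetString(), subDelimiter);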

/*! Function for adding a Padding layer to the deep neural network, with given
 * top, bottom, left, and right padding. It takes every matrix from the
 * previous layer and pads it with zeros into a matrix with new dimensions. */
TPaddingLayer<Architecture_t> *AddPaddingLayer(size_t topPad, size_t bottomPad, size_t leftPad, size_t rightPad);

Review comment:

Can we also have a factory method for the "full" and "valid" paddings (without the user needing to specify exact dimensions)?

srk97 (author) replied:

Should we make this a factory method in the padding layer itself, or have this option only for the conv layer?
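One possible shape for such a factory, as a sketch only (the enum and all names are hypothetical, and the formulas assume a stride of 1):

// Map a named padding mode onto explicit pad sizes and delegate to the
// existing four-argument overload.
enum class EPaddingMode { kValid, kSame, kFull };

TPaddingLayer<Architecture_t> *AddPaddingLayer(EPaddingMode mode, size_t fltHeight, size_t fltWidth)
{
   size_t padH = 0, padW = 0;                // kValid: no padding
   if (mode == EPaddingMode::kSame) {        // preserve spatial size at stride 1
      padH = (fltHeight - 1) / 2;
      padW = (fltWidth - 1) / 2;
   } else if (mode == EPaddingMode::kFull) { // pad by filter size minus 1 on each side
      padH = fltHeight - 1;
      padW = fltWidth - 1;
   }
   return AddPaddingLayer(padH, padH, padW, padW);
}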

namespace CNN {

template <typename Architecture_t>
class TPaddingLayer : public VGeneralLayer<Architecture_t>

Review comment:

If I understand correctly, this is more appropriately named PaddingLayer2D, since it assumes a 2D spatial layout. (Then we can also integrate 1D and 3D layers later; this is the approach of e.g. Keras.)

srk97 (author) replied:

Yes, I'll fix the name.

fTopPad(topPad), fBottomPad(bottomPad), fLeftPad(leftPad), fRightPad(rightPad)
{

this->outputHeight = inputHeight + topPad + bottomPad;

Review comment:

Create private functions for this? The calculation can then be reused in the constructor for the "valid" and "full" options.

srk97 (author) replied:

Yes. I have made this change already. I'll push it soon.
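For reference, the helpers could look something like this, derived from the calculation shown above (names hypothetical):

// Reusable output-dimension calculations for the padded matrices.
size_t CalculateOutputHeight(size_t inputHeight) const { return inputHeight + fTopPad + fBottomPad; }
size_t CalculateOutputWidth(size_t inputWidth) const { return inputWidth + fLeftPad + fRightPad; }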
