@Ildar-Gaisin
Description of changes:

In the BERT Encoder model, the token embeddings should be multiplied by the square root of the embedding dimension before being summed with the segment embeddings and positional embeddings. If I am not mistaken, the original BERT model performs this multiplication, although I could not find where it is done in the official repository, https://github.com/google-research/bert.
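A minimal NumPy sketch of the proposed behavior. All names and shapes here are illustrative, not taken from the actual diff; the point is only the `sqrt(embed_dim)` scaling applied to token embeddings before the three embedding tables are summed.

```python
import math
import numpy as np

# Hypothetical dimensions for illustration only.
vocab_size, max_len, num_segments, embed_dim = 100, 16, 2, 8
rng = np.random.default_rng(0)

# Randomly initialized embedding tables standing in for learned parameters.
token_emb = rng.normal(size=(vocab_size, embed_dim))
segment_emb = rng.normal(size=(num_segments, embed_dim))
position_emb = rng.normal(size=(max_len, embed_dim))

def encode(token_ids, segment_ids):
    """Sum token, segment, and positional embeddings for a sequence."""
    seq_len = len(token_ids)
    # The change proposed in this PR: scale token embeddings by
    # sqrt(embed_dim) before adding the other two embeddings.
    scaled = token_emb[token_ids] * math.sqrt(embed_dim)
    return scaled + segment_emb[segment_ids] + position_emb[:seq_len]

out = encode([1, 2, 3], [0, 0, 1])
print(out.shape)  # (3, 8)
```

This mirrors the scaling used in the original Transformer ("Attention Is All You Need"), where embeddings are multiplied by the square root of the model dimension before positional encodings are added.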

By submitting this pull request, I confirm that you can use, modify,
copy, and redistribute this contribution, under the terms of your
choice.
