Unused Weights When Loading Model #108

Open · AshOlogn opened this issue Oct 31, 2020 · 0 comments

AshOlogn commented Oct 31, 2020

I am using the newest version of Transformers (version 3.0) and am trying to load a SciBERT model with the following code:

```python
from transformers import AutoTokenizer, BertForMaskedLM

scibert_tokenizer = AutoTokenizer.from_pretrained('allenai/scibert_scivocab_uncased')
scibert_model = BertForMaskedLM.from_pretrained('allenai/scibert_scivocab_uncased')
```

When I do this, I get the following message indicating unused weights:

```
Some weights of the model checkpoint at allenai/scibert_scivocab_uncased were not used when initializing BertForMaskedLM: ['cls.seq_relationship.weight', 'cls.seq_relationship.bias']
This IS expected if you are initializing BertForMaskedLM from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPretraining model).
This IS NOT expected if you are initializing BertForMaskedLM from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
```
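For context, `cls.seq_relationship.weight` and `cls.seq_relationship.bias` are the weights of BERT's next-sentence-prediction (NSP) head, which `BertForMaskedLM` does not include. A minimal check (a sketch, assuming the checkpoint contains the full pre-training heads, as the warning suggests) is to load the same checkpoint with `BertForPreTraining`, which keeps both heads:

```python
from transformers import BertForPreTraining

# BertForPreTraining has both the masked-LM head (cls.predictions.*) and
# the NSP head (cls.seq_relationship.*), so loading the same checkpoint
# with it should consume these weights and not report them as unused.
model = BertForPreTraining.from_pretrained('allenai/scibert_scivocab_uncased')
```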

I do not get a similar message when using an earlier version of Transformers (a 2.x release).
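If the message is just noise, later Transformers releases (3.1 and up, where the `transformers.logging` API was added) let you lower the library's verbosity; a sketch:

```python
from transformers import logging

# Silence informational/warning messages such as the
# "Some weights of the model checkpoint ... were not used" notice.
logging.set_verbosity_error()
```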
