Hi, I am using one of the notebooks for sequence classification to finetune a BERT variant. I noticed that the notebook's instructions end once the model finishes training. I'd like to ask whether there is a straightforward way to save the finetuned model locally and then load it on another machine for inference on new data.
I have already managed to convert the txt/csv files to a DataFrame, turn the DataFrame into a dataset, and build the dataloader for inference. However, I am unable to load the trained model. To save the finetuned model I ran `classifier.save_model("./trained_bert_base_classifier.bin")`, and I then tried (unsuccessfully) to load it with the transformers `AutoModel`, with `torch.load`, and with `Transformer.load_model()` from utils.
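For reference, the pattern I was hoping would work is sketched below. I am assuming the recipe's wrapper exposes the underlying Hugging Face model as `classifier.model`; that attribute name and the tokenizer handling are guesses on my part, not something I found in the recipe code.

```python
# Minimal sketch of what I expected to work -- NOT taken from the recipes.
# Assumption: the finetuned Hugging Face model is reachable as `classifier.model`.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# --- on the training machine ---
save_dir = "./trained_bert_base_classifier"
classifier.model.save_pretrained(save_dir)   # assumed attribute; writes weights + config
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")  # same base checkpoint used for finetuning
tokenizer.save_pretrained(save_dir)          # so the other machine can tokenize identically

# --- on the inference machine ---
model = AutoModelForSequenceClassification.from_pretrained(save_dir)
tokenizer = AutoTokenizer.from_pretrained(save_dir)
model.eval()  # switch off dropout etc. before running inference
```

If the wrapper does not expose the underlying model like this, I am not sure how to get at the weights in a way that `from_pretrained` or `torch.load` can consume.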
I would really appreciate some help on how to properly save and load a finetuned model using the existing recipes.