
NLP Course Project

Implementation of DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter

  • Three .ipynb notebooks, one for each of the three datasets discussed in the paper.
  • BERT_RTE.py trains on the RTE dataset; CustomInLawBert_RR.py and DistilBert_RR.py train on the rhetorical role prediction dataset (a minimal fine-tuning sketch follows this list).
  • The JSON files contain the rhetorical role prediction dataset.
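A minimal sketch (not the exact code in the training scripts above) of fine-tuning DistilBERT for sentence-pair classification with the Hugging Face `transformers` library, as would apply to a task like RTE; the model name, label count, and example sentences are illustrative assumptions:

```python
# Hedged sketch: fine-tuning DistilBERT on a sentence-pair classification task
# such as RTE. The checkpoint name and labels here are illustrative assumptions.
import torch
from transformers import (
    DistilBertTokenizerFast,
    DistilBertForSequenceClassification,
)

tokenizer = DistilBertTokenizerFast.from_pretrained("distilbert-base-uncased")
model = DistilBertForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2  # entailment / not_entailment
)

premise = "A cat is sleeping on the sofa."
hypothesis = "An animal is resting indoors."
inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)

# Forward pass with a dummy label; in training this would run inside a
# DataLoader loop followed by an optimizer step (e.g. AdamW).
labels = torch.tensor([0])
outputs = model(**inputs, labels=labels)
outputs.loss.backward()
```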

The knowledge distillation (KD) code is in a separate repository:

https://github.com/sankn123/KD_Lib
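For reference, a minimal sketch of the distillation objective DistilBERT uses: a temperature-scaled KL divergence between the student's and teacher's logits combined with the usual hard-label cross-entropy. The actual implementation lives in the KD_Lib repository linked above; the weighting `alpha` and `temperature` values below are illustrative assumptions.

```python
# Hedged sketch of a DistilBERT-style distillation loss; not the KD_Lib code.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    # Soft-target loss: match the teacher's softened output distribution.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)
    # Hard-label loss: standard supervised cross-entropy on the gold labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```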
