# NLP Course Project

An implementation of DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter (Sanh et al., 2019).

- Three `.ipynb` notebooks, one for each of the three datasets mentioned in the paper.
- `BERT_RTE.py` trains on the RTE dataset; `CustomInLawBert_RR.py` and `DistilBert_RR.py` train on the rhetorical role prediction dataset (a minimal fine-tuning sketch follows this list).
- The JSON files contain the rhetorical role prediction dataset.
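The exact setup of these scripts is not reproduced here. Below is a minimal, hypothetical sketch of what fine-tuning a pretrained checkpoint on a sentence-pair task like RTE looks like with the Hugging Face `transformers` library; the checkpoint name `distilbert-base-uncased`, the example sentence pair, the label, and `num_labels=2` are assumptions for illustration, not values read from this repository.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumed checkpoint; the repository's scripts may use a different one.
model_name = "distilbert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# RTE is a sentence-pair entailment task: premise and hypothesis are encoded together.
batch = tokenizer(
    ["A man is playing a guitar."],   # premise (made-up example)
    ["A person is making music."],    # hypothesis (made-up example)
    padding=True,
    truncation=True,
    return_tensors="pt",
)
labels = torch.tensor([0])  # illustrative label (0 = entailment in GLUE RTE)

# One gradient step of standard fine-tuning.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
loss = model(**batch, labels=labels).loss
loss.backward()
optimizer.step()
```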

The knowledge distillation (KD) code is in a separate repository:

https://github.com/sankn123/KD_Lib
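That repository implements the distillation itself. As background, the core idea is to train the smaller student to match the teacher's temperature-softened output distribution alongside the ground-truth labels. The PyTorch sketch below is a generic illustration of that combined loss, not code taken from KD_Lib; the temperature `T=2.0` and mixing weight `alpha=0.5` are placeholder values.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Generic KD loss: soft-target KL term plus hard-label CE term.

    T (temperature) softens both distributions; alpha balances the two terms.
    These defaults are illustrative, not the values used in this project.
    """
    # Soft targets: KL divergence between temperature-scaled distributions.
    # The T**2 factor keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T ** 2)
    # Hard targets: standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```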