
ZizhenWang/CS224n-2019


For the seq2seq (attention) model, see the linked reference; a minimal attention sketch also follows the model list below.

A suite of models is implemented:

1. NNLM (a true classic)
2. Word2Vec
3. TextCNN
4. TextRNN
5. Seq2Seq
6. Seq2Seq (Attention)
7. BiLSTM
8. BiLSTM (Attention)
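As referenced above, items 6 and 8 add an attention mechanism on top of the base encoder–decoder or BiLSTM model. The snippet below is a minimal sketch of Luong-style dot-product attention, assuming PyTorch; the function name, tensor shapes, and hidden size are illustrative and not taken from this repository's notebooks.

```python
# Minimal sketch of dot-product (Luong-style) attention for a seq2seq decoder.
# Assumes PyTorch; names and shapes are illustrative, not this repo's actual code.
import torch
import torch.nn.functional as F

def dot_product_attention(decoder_hidden, encoder_outputs):
    """decoder_hidden: (batch, hidden); encoder_outputs: (batch, src_len, hidden)."""
    # Scores: similarity between the decoder state and each encoder state.
    scores = torch.bmm(encoder_outputs, decoder_hidden.unsqueeze(2)).squeeze(2)  # (batch, src_len)
    weights = F.softmax(scores, dim=1)                                           # (batch, src_len)
    # Context vector: attention-weighted sum of encoder states.
    context = torch.bmm(weights.unsqueeze(1), encoder_outputs).squeeze(1)        # (batch, hidden)
    return context, weights

# Example usage with random tensors.
dec_h = torch.randn(4, 256)          # one decoder step for a batch of 4
enc_out = torch.randn(4, 10, 256)    # 10 encoder time steps
context, weights = dot_product_attention(dec_h, enc_out)
print(context.shape, weights.shape)  # torch.Size([4, 256]) torch.Size([4, 10])
```

The context vector is then concatenated with the decoder state (or its output) before the next prediction; variants differ mainly in how the scores are computed (dot product, general, or additive).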

About

Stanford CS224n 2019 homework.


Languages

  • Jupyter Notebook 99.5%
  • Python 0.5%