Tracking
Tracking
Tanja edited this page Nov 17, 2015 · 35 revisions
- Read about neural language models in
  - A Neural Probabilistic Language Model
    - Learning simultaneously: distributed word vector representations and a statistical language model (given all previous words, what is the probability distribution for the next word?)
    - Shallow model, but can process large datasets
    - The N previous words are encoded via a 1-of-V mapping
  - Efficient Estimation of Word Representations in Vector Space
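The 1-of-V mapping mentioned above can be sketched as follows. This is a minimal illustration of the input encoding (my own toy vocabulary and function names, not project code): each of the N previous words becomes a vocabulary-sized vector with a single 1, and the context is the concatenation of these vectors.

```python
# Sketch of a 1-of-V ("one-hot") input encoding, as used for the N
# previous words in a neural probabilistic language model.
# Vocabulary and function names here are illustrative assumptions.

def one_hot(index, vocab_size):
    # A length-V vector that is all zeros except at the word's index.
    vec = [0.0] * vocab_size
    vec[index] = 1.0
    return vec

def encode_context(words, vocab):
    # Concatenate the one-hot vectors of the N previous words.
    v = len(vocab)
    out = []
    for w in words:
        out.extend(one_hot(vocab[w], v))
    return out

vocab = {"the": 0, "cat": 1, "sat": 2}
x = encode_context(["the", "cat"], vocab)  # N = 2 previous words
```

With V = 3 and N = 2 the encoded context has length N·V = 6, which is why this input layer grows quickly with vocabulary size.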
- Read https://code.google.com/p/word2vec/
- A Neural Probabilistic Language Model
- Read through `distance.c` in word2vec to understand the word2vec binary format
- Write Python program to read word vectors with Joseph
- Set up virtual Python environment on server
- Basic network and experiment infrastructure
- Write glue code connecting sentence files, sliding window, and LevelDB creation
- Create databases, run first experiment on server
- Caffe Multi-class Precision and Recall
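The word2vec binary format that `distance.c` parses can be read in Python along these lines. This is a sketch under my reading of that format: a text header `"vocab_size dim\n"`, then per word the token bytes terminated by a space, `dim` little-endian float32 values, and a trailing newline; the toy file written below only exercises the reader.

```python
# Sketch: read word vectors from the binary format written by word2vec.c
# and parsed by distance.c (assumed layout, see lead-in above).
import struct

def read_word2vec_bin(path):
    vectors = {}
    with open(path, "rb") as f:
        vocab_size, dim = map(int, f.readline().split())
        for _ in range(vocab_size):
            # The word is stored as raw bytes terminated by a space.
            word = bytearray()
            while True:
                c = f.read(1)
                if c == b" ":
                    break
                word.extend(c)
            # dim little-endian float32 values follow the word.
            vec = struct.unpack("<%df" % dim, f.read(4 * dim))
            vectors[word.decode("utf-8")] = vec
            f.read(1)  # skip the newline written after each vector
    return vectors

# Build a tiny two-word file in the same layout to exercise the reader.
with open("toy.bin", "wb") as f:
    f.write(b"2 3\n")
    f.write(b"cat " + struct.pack("<3f", 1.0, 0.0, 0.0) + b"\n")
    f.write(b"dog " + struct.pack("<3f", 0.0, 1.0, 0.0) + b"\n")

vecs = read_word2vec_bin("toy.bin")
```

Reading byte-by-byte for the word mirrors how `distance.c` scans up to the separating space, so the same code works for real model files regardless of word length.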
- Read https://code.google.com/p/word2vec/
- Understand `distance.c`, implement Python program to read word vectors
- Explore how to use NLTK for POS tagging
- Read https://code.google.com/p/word2vec/
- Read some papers
- Write Python script to parse XML and ASR transcript files
- Write Python script to create basic training instances using a sliding window
- Write script to store training instances in LevelDB
- Ensure valid train and test split
- Caffe Multi-class Precision and Recall
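The sliding-window instance creation above can be sketched as follows. This is a minimal stand-in for the actual script (function name and window size are my assumptions): each window of N tokens forms the context, and the token that follows is the prediction target.

```python
# Sketch: slide a fixed-size window over a token sequence and emit
# (context, target) training instances, where the n preceding words
# are the features and the next word is the label.
def sliding_window_instances(tokens, n=3):
    instances = []
    for i in range(len(tokens) - n):
        context = tokens[i:i + n]
        target = tokens[i + n]
        instances.append((context, target))
    return instances

pairs = sliding_window_instances(
    ["the", "cat", "sat", "on", "the", "mat"], n=2)
```

A sequence of T tokens yields T − n instances; keeping whole sentences (or whole transcripts) on one side of the train/test split avoids overlapping windows leaking between the two sets.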
- Pipeline work
- Use POS tags as features
- Read https://code.google.com/p/word2vec/
- Read papers to get familiar with Deep Learning
- Write Python script to parse XML and ASR transcript files
- Write Python script to create basic training instances using a sliding window
- Refactor script for creating training instances and work on pipeline to create instances
- Write Python script for demo showcases
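The glue pipeline recurring above (sentence files → sliding window → LevelDB) can be sketched end to end. This is an illustration under stated assumptions: the input is plain sentence strings, instances are JSON-serialized, and a plain dict stands in for the LevelDB handle (a real run would use a binding such as plyvel with the same put-by-key idea).

```python
# Sketch of the instance-creation pipeline: read sentences, build
# sliding-window (context, target) instances, and store them under
# sequential keys as LevelDB-style (key, value) byte pairs.
# The dict is a stand-in for an actual LevelDB database handle.
import json

def window(tokens, n):
    # Yield (context, target) pairs from a sliding window of size n.
    for i in range(len(tokens) - n):
        yield tokens[i:i + n], tokens[i + n]

def build_instance_db(sentences, n=2):
    db = {}  # stand-in for e.g. plyvel.DB("instances", ...) (assumption)
    idx = 0
    for sentence in sentences:
        for context, target in window(sentence.split(), n):
            key = ("%08d" % idx).encode()  # zero-padded keys keep order
            db[key] = json.dumps(
                {"context": context, "target": target}).encode()
            idx += 1
    return db

db = build_instance_db(["the cat sat on the mat", "dogs bark loudly"])
```

Zero-padded sequential keys preserve insertion order under LevelDB's lexicographic key ordering, which keeps instances from the same sentence adjacent when iterating during training.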