My seq2seq based on TensorFlow
This is a project for learning to implement different seq2seq models in TensorFlow.
This project is intended only for learning, which means it will contain many bugs. For real experiments and for training seq2seq models, I suggest using the nmt project; you can find it in the reference part.
I am experimenting with CopyNet and pointer-generator (PG) on the LCSTS dataset; you can find the code in the lcsts branch.
Issues and suggestions are welcome.
The models I have implemented are as follows:
For implementation details, refer to the ReadMe in the model folder.
A typical sequence-to-sequence (seq2seq) model contains an encoder, a decoder, and an attention mechanism. TensorFlow provides many useful APIs for implementing a seq2seq model; you will usually need the APIs below:
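To make the encoder-decoder-attention structure concrete, here is a minimal, framework-agnostic sketch of dot-product attention in plain Python (the function names are illustrative, not part of any TensorFlow API): at each decoding step, the decoder state is scored against every encoder state, the scores are normalized with a softmax, and the weighted sum of encoder states forms the context vector fed to the decoder.

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def dot_product_attention(decoder_state, encoder_states):
    """Return (attention weights, context vector) for one decoding step.

    decoder_state: list[float], the current decoder hidden state.
    encoder_states: list[list[float]], one hidden state per source position.
    """
    # Score each encoder state by its dot product with the decoder state.
    scores = [sum(d * e for d, e in zip(decoder_state, h)) for h in encoder_states]
    weights = softmax(scores)
    # Context vector: attention-weighted sum of encoder states.
    dim = len(decoder_state)
    context = [sum(w * h[i] for w, h in zip(weights, encoder_states)) for i in range(dim)]
    return weights, context
```

In a real TensorFlow model the same computation runs batched over tensors, but the logic per step is exactly this.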
Use either:
Right now I only have cross-entropy loss. I will add the following metrics:
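For reference, seq2seq training loss is typically the average negative log-likelihood of the gold tokens, with padded time steps masked out. A minimal pure-Python sketch (function name and signature are illustrative):

```python
import math

def masked_cross_entropy(probs, targets, mask):
    """Average negative log-likelihood of the target tokens, ignoring padding.

    probs: list of per-step probability distributions over the vocabulary.
    targets: list of gold token ids, one per step.
    mask: list of 0/1 flags; 0 marks padding steps excluded from the loss.
    """
    total, count = 0.0, 0
    for dist, gold, m in zip(probs, targets, mask):
        if m:
            total += -math.log(dist[gold])  # NLL of the gold token
            count += 1
    return total / max(count, 1)
```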
Run the model on a toy dataset, i.e., reversing the sequence:
train:
python -m bin.toy_train
inference:
python -m bin.toy_inference
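The toy task pairs each random token sequence with its reversal, so a trained model just has to learn to copy the input backwards. A sketch of how such a dataset can be generated (the function and its defaults are hypothetical, not taken from the project's code):

```python
import random

def make_reverse_example(vocab_size=10, min_len=3, max_len=8, rng=random):
    """Generate one (source, target) pair where the target is the reversed source."""
    length = rng.randint(min_len, max_len)
    source = [rng.randrange(vocab_size) for _ in range(length)]
    target = list(reversed(source))
    return source, target
```

Because the mapping is trivial, the toy task is a quick sanity check that the encoder, decoder, and attention are wired up correctly before moving to real data.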
You can also run on the en-vi dataset; refer to en_vietnam_train.py in bin for more details.
You can find more training scripts in the bin directory.
Thanks to the following resources: