Text simplification model based on an encoder-decoder architecture (both Transformer and Seq2Seq variants).
Paper Link: http://www.aclweb.org/anthology/D18-1355
Google Drive links:
https://drive.google.com/open?id=132Jlza-16Ws1DJ7h4O89TyxJiFSFAPw7
https://drive.google.com/open?id=16gO8cLXttGR64_xvLHgMwgJeB1DzT93N
Training:
python model/train.py -ngpus 1 -bsize 64 -fw transformer -out bertal_wkori_direct -op adagrad -lr 0.01 --mode transbert_ori -nh 8 -nhl 6 -nel 6 -ndl 6 -lc True -eval_freq 0 --fetch_mode tf_example_dataset --subword_vocab_size 0 --dmode wk --tie_embedding all --bert_mode bert_token:bertbase:init --environment aws --memory direct

Evaluation:
python model/eval.py -ngpus 1 -bsize 256 -fw transformer -out bertal_wkori_direct -op adagrad -lr 0.01 --mode transbert_ori -nh 8 -nhl 6 -nel 6 -ndl 6 -lc True -eval_freq 0 --subword_vocab_size 0 --dmode wk --tie_embedding all --bert_mode bert_token:bertbase:init --environment aws
More configuration options can be found in util/arguments.py.
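As an illustration of how flags like -bsize, -fw, and -lr from the commands above might be declared, here is a minimal argparse sketch. This is an assumption for illustration only; the actual flag definitions, defaults, and help strings live in util/arguments.py and may differ.

```python
# Hypothetical sketch of the CLI flags; the real definitions are in
# util/arguments.py and may use different defaults or help text.
import argparse

def build_parser():
    parser = argparse.ArgumentParser(
        description="Illustrative subset of the training flags")
    parser.add_argument("-ngpus", type=int, default=1,
                        help="number of GPUs to use")
    parser.add_argument("-bsize", type=int, default=64,
                        help="batch size")
    parser.add_argument("-fw", type=str, default="transformer",
                        help="framework: transformer or seq2seq")
    parser.add_argument("-op", type=str, default="adagrad",
                        help="optimizer")
    parser.add_argument("-lr", type=float, default=0.01,
                        help="learning rate")
    parser.add_argument("-nh", type=int, default=8,
                        help="number of attention heads")
    return parser

# Example: parse a few of the flags used in the training command above.
args = build_parser().parse_args(["-bsize", "64", "-fw", "transformer", "-lr", "0.01"])
print(args.bsize, args.fw, args.lr)
```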
Zhao, Sanqiang, et al. "Integrating Transformer and Paraphrase Rules for Sentence Simplification." Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. 2018.
@article{zhao2018integrating,
title={Integrating Transformer and Paraphrase Rules for Sentence Simplification},
author={Zhao, Sanqiang and Meng, Rui and He, Daqing and Andi, Saptono and Bambang, Parmanto},
journal={arXiv preprint arXiv:1810.11193},
year={2018}
}