
HAST

Aspect Term Extraction with History Attention and Selective Transformation (IJCAI 2018).

Requirements

  • Python 3.6
  • DyNet 2.0.2 (to build DyNet and enable the Python bindings, follow the official DyNet installation instructions)
  • nltk 3.2.2
  • numpy 1.13.3
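
nltk and numpy install via pip; if a prebuilt DyNet wheel exists for your platform, something like the following may suffice (otherwise build DyNet from source as noted above):

pip install dynet==2.0.2 nltk==3.2.2 numpy==1.13.3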

External Linguistic Resources

Preprocessing

  • Window-based input (window size of 3, as in Pengfei Liu's work); see the sketch after this list.
  • Replacing all punctuation marks with the single token PUNCT.
  • Only sentiment words with strong subjectivity are used to provide distant supervision.
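
A minimal sketch of the first two steps, assuming whitespace-tokenized input; the function name, padding token, and punctuation rule are illustrative, not the repo's actual API:

import string

def make_windows(tokens, win=3):
    # Replace tokens consisting entirely of punctuation with the
    # placeholder token PUNCT (illustrative rule; the repo's exact
    # criterion may differ).
    tokens = ["PUNCT" if all(c in string.punctuation for c in t) else t
              for t in tokens]
    # Pad both ends so every position gets a full window of `win` tokens.
    half = win // 2
    padded = ["<pad>"] * half + tokens + ["<pad>"] * half
    # One window per original token, centered on that token.
    return [padded[i:i + win] for i in range(len(tokens))]

print(make_windows(["the", "fish", ",", "however", ",", "was", "great"]))
# -> [['<pad>', 'the', 'fish'], ['the', 'fish', 'PUNCT'], ...]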

Running

python main.py -ds_name [YOUR_DATASET_NAME] -sgd_lr [YOUR_LEARNING_RATE_FOR_SGD] -win [YOUR_WINDOW_SIZE] -optimizer [YOUR_OPTIMIZER] -rnn_type [LSTM|GRU] -attention_type [bilinear|concat]
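
For example (the dataset name and learning rate below are placeholders, not tuned recommendations):

python main.py -ds_name laptop14 -sgd_lr 0.07 -win 3 -optimizer sgd -rnn_type LSTM -attention_type bilinear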

Environment

  • OS: RHEL Server 6.4 (Santiago)
  • CPU: Intel Xeon E5-2620 (yes, no GPU is used)

Citation

If you use this code in your research, please star this repo and cite our paper:

@inproceedings{li2018aspect,
  title={Aspect Term Extraction with History Attention and Selective Transformation},
  author={Li, Xin and Bing, Lidong and Li, Piji and Lam, Wai and Yang, Zhimou},
  booktitle={IJCAI},
  pages={4194--4200},
  year={2018}
}