BERT-based models (BERT, MTB, CP) for relation extraction.
Dataset and code for Learning from Context or Names? An Empirical Study on Neural Relation Extraction.
If you use this code, please cite us:
```
@article{peng2020learning,
  title={Learning from Context or Names? An Empirical Study on Neural Relation Extraction},
  author={Peng, Hao and Gao, Tianyu and Han, Xu and Lin, Yankai and Li, Peng and Liu, Zhiyuan and Sun, Maosong and Zhou, Jie},
  journal={arXiv preprint arXiv:2010.01923},
  year={2020}
}
```
You can quickly run our code by the following steps: `cd` to the `pretrain` or `finetune` directory, then download and pre-process the data for pre-training or fine-tuning. Run the following script to install dependencies:

```
pip install -r requirement.txt
```
You need to install `transformers` and `apex` manually.
transformers
We use Hugging Face `transformers` to implement BERT. For convenience, we have downloaded `transformers` into `utils/` and modified some lines in the class `BertForMaskedLM` in `src/transformers/modeling_bert.py`, keeping the other code unchanged. To install `transformers` manually, just run

```
pip install .
```

from the root of the bundled package.
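After installing, you can verify that Python picks up the locally built package rather than a stock release. This is a minimal sketch, not part of the repository; the `bert-base-uncased` checkpoint name is the standard Hugging Face identifier and is assumed here purely for illustration:

```python
# Sanity check: confirm the installed transformers package is the locally
# built one and that the (modified) BertForMaskedLM loads correctly.
import transformers
from transformers import BertForMaskedLM, BertTokenizer

# Should point at the copy built from this repo, not a pip-released wheel.
print(transformers.__file__)

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
print(type(model).__name__)  # -> BertForMaskedLM
```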
apex

Install `apex` following the official guidance.
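Once `apex` is built, a quick way to confirm the install is to initialize its `amp` mixed-precision API on a toy model. This is only a hedged smoke test, not code from this repository: it requires a CUDA-capable GPU, and the `O1` opt level here is chosen purely for illustration.

```python
# Smoke test for apex: amp.initialize is apex's standard mixed-precision
# entry point. Requires a CUDA-capable GPU.
import torch
from apex import amp

model = torch.nn.Linear(4, 2).cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# Patch the model and optimizer for mixed precision; "O1" is an
# illustrative opt_level, not necessarily what this repo's scripts use.
model, optimizer = amp.initialize(model, optimizer, opt_level="O1")
print("apex amp initialized")
```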
You can `cd` to `pretrain` or `finetune` to learn more details about pre-training or fine-tuning.