Recurrent Back Propagation, Back Propagation Through Optimization, ICML 2018
This is the PyTorch implementation of Recurrent Back Propagation as described in the following ICML 2018 paper:
@article{liao2018reviving,
title={Reviving and Improving Recurrent Back-Propagation},
author={Liao, Renjie and Xiong, Yuwen and Fetaya, Ethan and Zhang, Lisa and Yoon, KiJung and Pitkow, Xaq and Urtasun, Raquel and Zemel, Richard},
journal={arXiv preprint arXiv:1803.06396},
year={2018}
}
Dependencies: Python 3, PyTorch (0.4.0)

To set up experiments, we need to build our customized operators by running the following script:

./setup.sh
To run experiment X, where X is one of {hopfield, cora, pubmed, hypergrad}:

python run_exp.py -c config/X.yaml
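The experiment configs are YAML files. As a rough sketch of what such a file might contain: only the `grad_method` key is mentioned by this README, so every other key and all values below are illustrative assumptions, not taken from the repository:

```yaml
# Hypothetical sketch of config/X.yaml.
# Only grad_method is documented in this README; the remaining
# keys and all values are assumptions for illustration.
grad_method: rbp      # selects the gradient method used for training
seed: 1234
optimizer:
  name: adam
  lr: 0.001
```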
Notes:

The gradient method used during training is selected via grad_method in the config file. Please check model/rbp.py for more details.

Please cite our paper if you use this code in your research work.
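For intuition about what an RBP-style gradient computes, here is a minimal NumPy sketch (not the repository's implementation in model/rbp.py, which is PyTorch-based). At a fixed point h* = f(h*), the backward pass needs (I - J^T)^{-1} g, where J is the Jacobian of f at h* and g = dL/dh*; the Neumann-RBP variant from the paper approximates this inverse with a truncated Neumann series. All names below (`neumann_rbp_vector`, the toy map `f(h) = tanh(W h + x)`) are illustrative assumptions:

```python
import numpy as np

def neumann_rbp_vector(J, g, K):
    # Truncated Neumann series: sum_{k=0}^{K} (J^T)^k g,
    # which approximates (I - J^T)^{-1} g when J is a contraction.
    v = g.copy()
    total = g.copy()
    for _ in range(K):
        v = J.T @ v
        total += v
    return total

# Toy fixed-point map f(h) = tanh(W h + x) with small weights (contractive).
rng = np.random.default_rng(0)
n = 5
W = 0.3 * rng.standard_normal((n, n))
x = rng.standard_normal(n)

h = np.zeros(n)
for _ in range(100):          # iterate to (near) the fixed point
    h = np.tanh(W @ h + x)

# Jacobian of f at the fixed point: diag(1 - h^2) @ W,
# using tanh(W h + x) = h at convergence.
J = (1.0 - h ** 2)[:, None] * W
g = h.copy()                  # dL/dh for L = 0.5 * ||h||^2

exact = np.linalg.solve(np.eye(n) - J.T, g)   # implicit-function-theorem solve
approx = neumann_rbp_vector(J, g, K=50)
print(np.max(np.abs(exact - approx)))         # small for a contractive map
```

The truncation length K trades accuracy for compute: each extra term costs one Jacobian-transpose product, and the error shrinks geometrically with the spectral radius of J.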
Please submit a GitHub issue or contact [email protected] if you have any questions or find any bugs.