SiamFC tracking in TensorFlow.
TensorFlow port of the tracking method described in the paper Fully-Convolutional Siamese Networks for Object Tracking.
In particular, it is the improved version presented as the baseline in End-to-end representation learning for Correlation Filter based tracking, which achieves state-of-the-art performance at a high framerate. The other methods presented in that paper (similar performance, shallower network) have not been ported yet.
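To illustrate the core idea behind the method (this is a didactic sketch, not the repo's TensorFlow implementation): a fully-convolutional Siamese tracker embeds an exemplar patch and a larger search region with the same conv-net, then cross-correlates the two feature maps to produce a response map whose peak locates the target. A minimal NumPy sketch with toy feature maps standing in for the conv-net embeddings:

```python
import numpy as np

def cross_correlate(exemplar_feat, search_feat):
    """Slide the exemplar feature map over the search feature map
    and compute the inner product at every offset (valid mode)."""
    eh, ew, _ = exemplar_feat.shape
    sh, sw, _ = search_feat.shape
    out = np.zeros((sh - eh + 1, sw - ew + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            window = search_feat[i:i + eh, j:j + ew, :]
            out[i, j] = np.sum(window * exemplar_feat)
    return out

# Toy feature maps; shapes chosen to mimic SiamFC's 6x6 exemplar
# and 22x22 search embeddings, which yield a 17x17 score map.
z = np.random.rand(6, 6, 16)    # exemplar embedding
x = np.random.rand(22, 22, 16)  # search region embedding
score = cross_correlate(z, x)
print(score.shape)  # (17, 17)
```

The peak of `score` gives the target's displacement within the search region; the real network computes this correlation as a convolution layer on learned features.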
Note 1: results should be similar to (i.e. slightly better or worse than) those of our MatConvNet implementation. However, for a direct comparison please refer to the precomputed results available on the project pages or to the original code, which you can find pinned on my GitHub.
Note 2: at the moment this code only allows using a pretrained network in forward (tracking) mode; training is not supported.
pip install virtualenv
virtualenv --python=/usr/bin/python2.7 ve-tracking
source ve-tracking/bin/activate
git clone https://github.com/torrvision/siamfc-tf.git
cd siamfc-tf
pip install -r requirements.txt
mkdir pretrained data
Download the pretrained network into `pretrained` and unzip the archive (we will only use `baseline-conv5_e55.mat`).
Download the video sequences into `data` and unzip the archive.
To run the tracker:
1) Set `video` from `parameters.evaluation` to `"all"` or to a specific sequence (e.g. `"vot2016_ball1"`).
2) Check the default settings in `parameters/hyperparameters.json` and `parameters/run.json`.
3) Run `python run_tracker_evaluation.py`.
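As an illustrative sketch only (the surrounding keys are assumptions; only the `video` field is described above), the evaluation settings file might look like:

```json
{
    "video": "vot2016_ball1"
}
```

Setting `"video": "all"` instead runs the evaluation over every available sequence.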
If you find our work useful, please consider citing
↓ [Original method] ↓
@inproceedings{bertinetto2016fully,
title={Fully-Convolutional Siamese Networks for Object Tracking},
author={Bertinetto, Luca and Valmadre, Jack and Henriques, Jo{\~a}o F and Vedaldi, Andrea and Torr, Philip H S},
booktitle={ECCV 2016 Workshops},
pages={850--865},
year={2016}
}
↓ [Improved method and evaluation] ↓
@article{valmadre2017end,
title={End-to-end representation learning for Correlation Filter based tracking},
author={Valmadre, Jack and Bertinetto, Luca and Henriques, Jo{\~a}o F and Vedaldi, Andrea and Torr, Philip H S},
journal={arXiv preprint arXiv:1704.06036},
year={2017}
}
This code can be freely used for personal, academic, or educational purposes. Please contact us for commercial use.