# Real-time 'Actor-Critic' Tracking (ECCV 2018)

Code for our tracker ACT, accepted by ECCV 2018.
We propose a novel tracking algorithm with real-time performance based on the ‘Actor-Critic’ framework.
## Results on OTB100
## Prerequisites

- TensorFlow 1.4.0 (training) and PyTorch 0.3.0 (testing)
- CUDA 8.0 and cuDNN 6.0
- Python 2.7
## Training

1. Download the ILSVRC VID dataset and put the `VID` folder into `$(ACT_root)/train/`. (We adopt the same videos as meta_trackers; you can find more details in `ilsvrc_train.json`.)
2. Run `$(ACT_root)/train/DDPG_train.py` to train the 'Actor' and 'Critic' networks.
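The DDPG-style 'Actor-Critic' idea behind the training script can be illustrated on a toy problem. The sketch below is purely illustrative and is not the repository's code (no deep networks, no tracking): a deterministic actor `a = theta * s` is improved by following a learned critic's action-gradient, which is the core DDPG update. The toy state, reward, features, and learning rates are all assumptions chosen for the demo.

```python
import random

random.seed(0)

# Toy one-step ("contextual bandit") DDPG-style update, pure stdlib.
# State s in [-1, 1]; reward r = -(a - s)^2, so the optimal action is a = s.
# Actor (deterministic policy): a = theta * s (optimum: theta = 1).
# Critic: Q(s, a) = w . phi(s, a), a linear stand-in for the critic network.

theta = 0.0                      # actor parameter
w = [0.0, 0.0, 0.0, 0.0]         # critic weights for phi = [1, s^2, a^2, s*a]
alpha_c, alpha_a = 0.1, 0.05     # critic / actor learning rates (assumed)

def phi(s, a):
    return [1.0, s * s, a * a, s * a]

def Q(s, a):
    return sum(wi * fi for wi, fi in zip(w, phi(s, a)))

for step in range(5000):
    s = random.uniform(-1.0, 1.0)
    a = theta * s + random.gauss(0.0, 0.3)   # exploration noise on the action
    r = -(a - s) ** 2
    # Critic: SGD on the squared error (one-step episode, so the target is r).
    err = Q(s, a) - r
    feats = phi(s, a)
    for i in range(4):
        w[i] -= alpha_c * err * feats[i]
    # Actor: ascend the critic's action-gradient dQ/da at the policy action.
    a_pi = theta * s
    dQ_da = 2.0 * w[2] * a_pi + w[3] * s
    theta += alpha_a * dQ_da * s

print(round(theta, 2))  # should approach 1.0
```

In the paper's setting the state is an image observation, the action is a continuous bounding-box motion, and both actor and critic are deep networks, but the update structure is the same: the critic fits the expected return, and the actor moves in the direction the critic says increases it.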
## Tracking

Run `$(ACT_root)/tracking/run_tracker.py` for a demo.
## Citation

If you find ACT useful in your research, please kindly cite our paper:

    @InProceedings{Chen_2018_ECCV,
      author    = {Chen, Boyu and Wang, Dong and Li, Peixia and Wang, Shuang and Lu, Huchuan},
      title     = {Real-time 'Actor-Critic' Tracking},
      booktitle = {The European Conference on Computer Vision (ECCV)},
      month     = {September},
      year      = {2018}
    }
## Contact

If you have any questions, please feel free to contact bychen@mail.dlut.edu.cn.
## Acknowledgments

Many parts of this code are adapted from other related works (py-MDNet and meta_trackers).