CGACD

Correlation-Guided Attention for Corner Detection Based Visual Tracking (CVPR 2020)


1. Environment setup

This code has been tested on Ubuntu 18.04 with Python 3.7, PyTorch 1.1.0, and CUDA 10.0. Please install the required libraries before running the code:

pip install -r requirements.txt
python setup.py build_ext --inplace

Add CGACD to your PYTHONPATH:

export PYTHONPATH=/path/to/CGACD:$PYTHONPATH
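
As a quick sanity check (this snippet is not part of the repository), you can verify that your environment roughly matches the tested configuration:

# Optional environment check: compare against the tested versions above.
import sys
import torch

print("Python :", sys.version.split()[0])    # tested with 3.7
print("PyTorch:", torch.__version__)         # tested with 1.1.0
print("CUDA   :", torch.version.cuda)        # tested with 10.0
print("GPU available:", torch.cuda.is_available())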

2. Test

Download the pretrained models for OTB and VOT (code: 16s0) and put them into the checkpoint directory.

Download the testing datasets and put them into the dataset directory. JSON annotations of commonly used datasets can be downloaded from BaiduYun or Google Drive. If you want to test the tracker on a new dataset, please refer to pysot-toolkit to set up the test dataset.
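
For reference, pysot-toolkit reads one JSON file per dataset, with one entry per video. The sketch below only illustrates that structure; the video name, paths, and boxes are made up, and the field names follow the common pysot convention, so they may need to be adapted to this repository's loader.

# Hypothetical pysot-style annotation for a custom test dataset.
import json

custom_dataset = {
    "video_001": {
        "video_dir": "video_001",              # folder containing the frames
        "init_rect": [150, 80, 60, 120],       # [x, y, w, h] in the first frame
        "img_names": ["video_001/0001.jpg",
                      "video_001/0002.jpg"],   # ordered frame paths
        "gt_rect": [[150, 80, 60, 120],
                    [152, 82, 60, 120]],       # per-frame ground-truth boxes
    }
}

# Place the resulting file under the dataset directory, e.g. dataset/CUSTOM/.
with open("CUSTOM.json", "w") as f:
    json.dump(custom_dataset, f)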

python tools/test.py                          \
    --dataset VOT2018                         \ # dataset name
    --model checkpoint/CGACD_VOT.pth          \ # model path
    --save_name CGACD_VOT                       # tracker name

The testing result will be saved in the results/dataset_name/tracker_name directory.
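
For OTB-style datasets the saved results are typically plain-text files with one predicted box per line, while VOT results follow the VOT toolkit's own layout. Below is a minimal sketch for inspecting such a file, assuming the common one-box-per-line "x,y,w,h" format; the path used in the example is hypothetical.

# Minimal sketch: load an OTB-style result file, assuming one
# comma- or tab-separated "x,y,w,h" box per line (VOT results differ).
def load_result(path):
    boxes = []
    with open(path) as f:
        for line in f:
            if not line.strip():
                continue
            x, y, w, h = map(float, line.strip().replace("\t", ",").split(","))
            boxes.append((x, y, w, h))
    return boxes

boxes = load_result("results/OTB2015/CGACD_OTB/Basketball.txt")  # hypothetical path
print(len(boxes), "frames; first box:", boxes[0])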

3. Train

Prepare training datasets

Download the datasets:

Scripts to prepare the training datasets are provided in the training_dataset directory.
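
For orientation, pysot-style preparation scripts usually crop the targets into fixed-size patches and emit a per-dataset JSON that maps video → track → frame → bounding box. The sketch below only illustrates that nesting; the keys and boxes are made up, and the exact format is defined by the scripts in training_dataset.

# Hypothetical pysot-style training annotation: video -> track id -> frame -> box.
import json

train_annotation = {
    "ILSVRC2015_train_00001000": {            # one video
        "00": {                               # one track (object) in that video
            "000000": [120, 60, 200, 180],    # frame index -> [x1, y1, x2, y2]
            "000001": [122, 61, 202, 181],
        }
    }
}

with open("train.json", "w") as f:
    json.dump(train_annotation, f)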

Download pretrained backbones

Download the pretrained backbones from Google Drive or BaiduYun (code: 5o1d) and put them into the pretrained_net directory.

Train a model

To train the CGACD model, run train.py with the desired configs:

python tools/train.py                                    \
    --config=experiments/cgacd_resnet/cgacd_resnet.yml   \
    -b 64                                                \
    -j 16                                                \
    --save_name cgacd_resnet

We use two RTX 2080 Ti GPUs for training.

4. Evaluation

We provide the tracking results (code: qw69) on OTB2015, VOT2018, UAV123, and LaSOT. If you want to evaluate the tracker, please put those results into the results directory.

python eval.py                  \
    -p ./results                \ # result path
    -d VOT2018                  \ # dataset name
    -t CGACD_VOT                  # tracker name

5. Acknowledgement

Our code is implemented on top of pysot and PreciseRoIPooling. We would like to express our sincere thanks to their contributors.

6. Cite

If you use CGACD in your work, please cite our paper:

@InProceedings{Du_2020_CVPR,
author = {Du, Fei and Liu, Peng and Zhao, Wei and Tang, Xianglong},
title = {Correlation-Guided Attention for Corner Detection Based Visual Tracking},
booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2020}
}