
DRG: Dual Relation Graph for Human-Object Interaction Detection

Official PyTorch implementation for DRG: Dual Relation Graph for Human-Object Interaction Detection (ECCV 2020).

See the project page for more details. Please contact Jiarui Xu (jiaruixu@vt.edu) if you have any questions related to implementation details.

Prerequisites

This codebase was tested with Python 3.6, PyTorch 1.0 (nightly release), CUDA 10.0, and CentOS 7.4.1708.

Installation

Please check INSTALL.md for installation instructions.

Data Downloads

Download the V-COCO and HICO-DET datasets and set up the HICO-DET evaluation code.

bash ./scripts/download_dataset.sh 
bash ./scripts/download_data.sh

Evaluation

  1. Download DRG detections and data

    bash ./scripts/download_drg_detection.sh
  2. Evaluate on V-COCO

    python tools/vcoco_compute_mAP.py \
        --dataset_name vcoco_test \
        --detection_file output/VCOCO/detection_merged_human_object_app.pkl
  3. Evaluate on HICO-DET

    cd Data/ho-rcnn
    matlab -r "Generate_detection('COCO'); quit"
    cd ../../
  4. Evaluate on HICO-DET with fine-tuned detections

    cd Data/ho-rcnn
    matlab -r "Generate_detection('finetune'); quit"
    cd ../../
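The V-COCO evaluation above reads detections from a pickle file (e.g. `output/VCOCO/detection_merged_human_object_app.pkl`). If you want to sanity-check a downloaded detection file before scoring, a minimal sketch is below; the `summarize` helper is our own illustration, and the exact record layout inside the pickle is an assumption — see `tools/vcoco_compute_mAP.py` for the fields your version actually uses.

```python
import pickle

def load_detections(path):
    """Deserialize a DRG detection file (a pickled Python object)."""
    with open(path, "rb") as f:
        return pickle.load(f)

def summarize(dets):
    """Return a one-line summary of a loaded detection container.

    Hypothetical helper: DRG detection files are assumed to hold either
    a dict keyed by image, or a list of per-detection records.
    """
    if isinstance(dets, dict):
        return "dict with %d keys" % len(dets)
    return "%s with %d entries" % (type(dets).__name__, len(dets))
```

For example, `print(summarize(load_detections("output/VCOCO/detection_merged_human_object_app.pkl")))` gives a quick check that the download completed and deserializes cleanly.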

Train

  1. Download pre-trained Faster R-CNN model weights for initialization

    bash ./scripts/download_frcnn.sh
  2. Train on V-COCO

    bash ./scripts/train_VCOCO.sh
  3. Train on HICO-DET

    bash ./scripts/train_HICO.sh

Test

  1. Test on V-COCO

    bash ./scripts/test_VCOCO.sh $APP_ITER_NUMBER $HUMAN_SP_ITER_NUMBER $OBJECT_SP_ITER_NUMBER
  2. Test on HICO-DET

    bash ./scripts/test_HICO.sh $APP_ITER_NUMBER $HUMAN_SP_ITER_NUMBER $OBJECT_SP_ITER_NUMBER

DRG Pretrained Weights

Download the pre-trained DRG model weights.

bash ./scripts/download_drg_models.sh

Object Detection

For a simple demo, you can try

python demo/demo_obj_det.py

Currently, we only support Faster R-CNN with a ResNet-50-FPN backbone.

TODO

  • Video demo generation code

Citation

If you find this code useful for your research, please consider citing the following papers:

@inproceedings{Gao-ECCV-DRG,
    author    = {Gao, Chen and Xu, Jiarui and Zou, Yuliang and Huang, Jia-Bin}, 
    title     = {DRG: Dual Relation Graph for Human-Object Interaction Detection}, 
    booktitle = {European Conference on Computer Vision},
    year      = {2020}
}

@inproceedings{gao2018ican,
    author    = {Gao, Chen and Zou, Yuliang and Huang, Jia-Bin}, 
    title     = {iCAN: Instance-Centric Attention Network for Human-Object Interaction Detection}, 
    booktitle = {British Machine Vision Conference},
    year      = {2018}
}

Acknowledgement

This code follows the implementation architecture of maskrcnn-benchmark, iCAN and No Frills.