EWTH_Loss

The implementation of the NeurIPS 2020 paper: The Dilemma of TriHard Loss and an Element-Weighted TriHard Loss for Person Re-Identification


The Dilemma of TriHard Loss and an Element-Weighted TriHard Loss for Person Re-Identification

All the losses proposed in this paper are implemented on top of Bag of Tricks and A Strong Baseline for Deep Person Re-identification (BoT) and Deep Learning for Person Re-identification: A Survey and Outlook (AGW).

Requirements

See README of BoT and README of AGW for requirements.

Training

The models in this paper are trained on BoT and AGW with ResNet50 as the backbone, but a few modifications need to be made to the baselines.

Bag of Tricks (BoT)

All experiments are conducted without center loss, which is optional in BoT. All the required files are in the BoT folder.

To train the network with the losses in this paper, replace the following files in reid-strong-baseline with the corresponding files provided in the BoT folder:

- baseline.py in reid-strong-baseline/modeling/
- trainer.py in reid-strong-baseline/engine/
- __init__.py and triplet_loss.py in reid-strong-baseline/layers/
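
For convenience, a minimal copy script is sketched below. It assumes this repository and reid-strong-baseline are cloned side by side; the paths are assumptions and should be adjusted to your own layout.

```python
import shutil
from pathlib import Path

# Assumed layout: EWTH_Loss/ and reid-strong-baseline/ cloned next to each other.
SRC = Path("EWTH_Loss/BoT")          # files provided in this repository
DST = Path("reid-strong-baseline")   # the BoT baseline repository

# (source file in the BoT folder, destination inside the baseline repo)
replacements = [
    ("baseline.py", "modeling/baseline.py"),
    ("trainer.py", "engine/trainer.py"),
    ("__init__.py", "layers/__init__.py"),
    ("triplet_loss.py", "layers/triplet_loss.py"),
]

for src_name, dst_rel in replacements:
    shutil.copyfile(SRC / src_name, DST / dst_rel)
    print(f"replaced {DST / dst_rel}")
```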

You can train the network with the different losses proposed in this paper by changing the loss in triplet_loss.py; the default is HNEWTH. Margins of the EWTH loss can be changed in reid-strong-baseline/configs/softmax_triplet.yml and triplet_loss.py, and the hyperparameter $t$ can be changed in triplet_loss.py. Other BoT settings keep their default values; further training details are available in reid-strong-baseline/README.md.
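
For reference, the baseline loss that these variants modify is the TriHard (batch-hard triplet) loss. The sketch below is a generic illustration of hard-example mining with a margin, not the EWTH/HNEWTH code in triplet_loss.py; the margin value is a placeholder.

```python
import torch
import torch.nn.functional as F

def trihard_loss(features, labels, margin=0.3):
    """Generic TriHard (batch-hard triplet) loss: for each anchor, mine the
    hardest positive and hardest negative in the batch, then apply a margin
    hinge. The element-weighted variants in the paper modify the distance
    computation; see triplet_loss.py for the actual implementations."""
    # Pairwise Euclidean distance matrix, shape (N, N).
    dist = torch.cdist(features, features, p=2)

    # Masks for same-identity (positive) and different-identity (negative) pairs.
    labels = labels.view(-1, 1)
    is_pos = labels.eq(labels.t())
    is_neg = ~is_pos

    # Hardest positive: largest distance among positives for each anchor.
    dist_ap = dist.masked_fill(is_neg, float("-inf")).max(dim=1).values
    # Hardest negative: smallest distance among negatives for each anchor.
    dist_an = dist.masked_fill(is_pos, float("inf")).min(dim=1).values

    # Hinge on the margin: we want dist_an >= dist_ap + margin.
    return F.relu(dist_ap - dist_an + margin).mean()
```

With, e.g., a batch of 64 backbone features of shape (64, 2048) and a label vector of shape (64,), trihard_loss(features, labels) returns a scalar that is added to the ID (cross-entropy) loss, as in BoT.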

AGW

All the required files are in folder AGW.

To train the network with the losses in this paper, set WEIGHT_REGULARIZED_TRIPLET to "off" in ReID-Survey/configs/AGW_baseline.yml, and replace triplet_loss.py in ReID-Survey/modeling/layer/ and baseline.py in ReID-Survey/modeling/ with the corresponding files provided in the AGW folder.

You can train the network with the different losses proposed in this paper by changing the loss in triplet_loss.py; the default is HNEWTH. Margins of the EWTH loss can be changed in ReID-Survey/config/defaults.py and triplet_loss.py, and the hyperparameter $t$ can be changed in triplet_loss.py. Other AGW settings keep their default values; further training details are available in ReID-Survey/README.md.
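
Because the loss variant, the margins, and $t$ are all chosen inside triplet_loss.py rather than through a single config option, a hypothetical selection pattern is sketched below. The constant names and the build_triplet_loss helper are illustrative only and do not appear in the actual code.

```python
# Hypothetical module-level switches; names are illustrative, not taken from
# the repository's triplet_loss.py.
LOSS_TYPE = "HNEWTH"   # e.g. "TH", "HTH", "EWTH", "NEWTH", "HNEWTH"
MARGIN = 0.3           # EWTH margin, kept in sync with defaults.py / the yml
T = 1.0                # placeholder value for the hyperparameter t

def build_triplet_loss(loss_type: str = LOSS_TYPE):
    """Map a loss name to its constructor (illustrative sketch only)."""
    registry = {
        # "TH": lambda: TripletLoss(margin=MARGIN),
        # "EWTH": lambda: ElementWeightedTriHard(margin=MARGIN, t=T),
        # ... see triplet_loss.py for the actual classes and names.
    }
    if loss_type not in registry:
        raise KeyError(f"loss {loss_type!r} must be selected in triplet_loss.py")
    return registry[loss_type]()
```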

Evaluation

See README of BoT and README of AGW.

Pre-trained Models

The pre-trained ResNet50 model will be downloaded to the locations specified in BoT and AGW. See the README of BoT and the README of AGW for further information.

Results

In AGW+HNTH, $\alpha_1=0.1$, and in AGW+EWTH, $\alpha_2=0.2$. All other margins use the default value of 0.3.

| Method | Market1501 mAP | Market1501 rank-1 | MSMT17 mAP | MSMT17 rank-1 |
| --- | --- | --- | --- | --- |
| BoT+TH | 85.6% | 94.1% | 45.1% | 63.9% |
| BoT+HTH | 86.6% | 94.6% | 45.0% | 63.9% |
| BoT+EWTH | 87.7% | 95.0% | 48.7% | 67.8% |
| BoT+NEWTH | 88.4% | 95.1% | 49.7% | 68.1% |
| BoT+TH+FN | 86.3% | 94.1% | 45.2% | 63.8% |
| AGW+TH | 87.7% | 95.0% | 48.4% | 67.9% |
| AGW+HTH | 88.1% | 95.4% | 48.1% | 67.6% |
| AGW+EWTH | 88.5% | 95.4% | 50.4% | 69.6% |
| AGW+NEWTH | 89.4% | 95.6% | 53.1% | 71.5% |
| AGW+TH+FN | 88.0% | 95.1% | 47.7% | 66.3% |