CityPersons

This repo provides bounding box annotations, Python evaluation code, and a benchmark for CityPersons, which is a subset of the Cityscapes dataset. Please download the images from the Cityscapes website!

You are welcome to join the competition by submitting your results on the test set!

We strongly encourage you to use your institutional email account for submission.

Benchmark

| Method | External training data | MR (Reasonable) | MR (Reasonable_small) | MR (Reasonable_occ=heavy) | MR (All) |
|---|---|---|---|---|---|
| DIW Loss | ✓ | 6.23% | 7.36% | 28.37% | 26.45% |
| LSFM | ✓ | 6.38% | 7.90% | 24.73% | 31.36% |
| APD-pretrain | ✓ | 7.31% | 10.81% | 28.07% | 32.71% |
| Pedestron | ✓ | 7.69% | 9.16% | 27.08% | 28.33% |
| APD | × | 8.27% | 11.03% | 35.45% | 35.65% |
| YT-PedDet | × | 8.41% | 10.60% | 37.88% | 37.22% |
| STNet | × | 8.92% | 11.13% | 34.31% | 29.54% |
| MGAN | × | 9.29% | 11.38% | 40.97% | 38.86% |
| DVRNet | × | 11.17% | 15.62% | 42.52% | 40.99% |
| HBA-RCNN | × | 11.26% | 15.68% | 39.54% | 38.77% |
| OR-CNN | × | 11.32% | 14.19% | 51.43% | 40.19% |
| AdaptiveNMS | × | 11.40% | 13.64% | 46.99% | 38.89% |
| Repulsion Loss | × | 11.48% | 15.67% | 52.59% | 39.17% |
| Cascade MS-CNN | × | 11.62% | 13.64% | 47.14% | 37.63% |
| Adapted FasterRCNN | × | 12.97% | 37.24% | 50.47% | 43.86% |
| MS-CNN | × | 13.32% | 15.86% | 51.88% | 39.94% |
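The MR columns above report miss rate; in pedestrian detection benchmarks of this kind, this is conventionally the log-average miss rate (lower is better), averaged over 9 FPPI (false positives per image) reference points spaced evenly in log space over [10^-2, 10^0]. The following is a minimal illustrative sketch of that computation, not the official CityPersons evaluation code; the function name and array-based interface are assumptions for the example.

```python
import numpy as np

def log_average_miss_rate(fppi, miss_rate):
    """Illustrative log-average miss rate.

    `fppi` and `miss_rate` are parallel 1-D arrays describing a
    detector's FPPI/miss-rate curve, with `fppi` sorted ascending.
    The metric samples the miss rate at 9 FPPI reference points
    evenly spaced in log space over [1e-2, 1e0] and returns their
    geometric mean.
    """
    fppi = np.asarray(fppi, dtype=float)
    miss_rate = np.asarray(miss_rate, dtype=float)
    ref_points = np.logspace(-2.0, 0.0, num=9)  # 1e-2 ... 1e0

    sampled = []
    for ref in ref_points:
        # Miss rate at the largest FPPI not exceeding the reference;
        # if the curve never gets that low, fall back to the first point.
        idx = np.where(fppi <= ref)[0]
        sampled.append(miss_rate[idx[-1]] if idx.size else miss_rate[0])

    # Geometric mean (clamped away from zero so the log is defined).
    return float(np.exp(np.mean(np.log(np.maximum(sampled, 1e-10)))))
```

For example, a detector whose curve sits at a constant 10% miss rate across the whole FPPI range would score an MR of 0.10 (10%) under this definition.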

Please refer to the instructions on submitting results for evaluation.

What Do We Have?

Annotation Example

(annotation example image)

Citation

If you use this data and code, please cite the following papers:

@INPROCEEDINGS{Shanshan2017CVPR,
  author    = {Shanshan Zhang and Rodrigo Benenson and Bernt Schiele},
  title     = {CityPersons: A Diverse Dataset for Pedestrian Detection},
  booktitle = {CVPR},
  year      = {2017}
}

@INPROCEEDINGS{Cordts2016Cityscapes,
  author    = {Cordts, Marius and Omran, Mohamed and Ramos, Sebastian and Rehfeld, Timo and Enzweiler, Markus and Benenson, Rodrigo and Franke, Uwe and Roth, Stefan and Schiele, Bernt},
  title     = {The Cityscapes Dataset for Semantic Urban Scene Understanding},
  booktitle = {Proc. of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  year      = {2016}
}

This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author’s copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.