This repository contains the code for the paper Multi-Human Parsing Machines.
The dataset used in this repo is MHP-v1, which can be downloaded from our LV-MHP website.
Put the downloaded data under the folder
./data
and unzip it there to generate two subfolders: images and annotations. Then run the scripts
generate_global_seg.py
generate_global_tag.py
inside ./data to generate the global segmentation maps and global tags.
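The exact logic of generate_global_seg.py is defined by the script itself, but the idea of a "global" segmentation map is to fuse the per-person part masks that MHP-v1 provides for each image into a single label map. A minimal pure-Python sketch of that fusion (function and variable names here are illustrative, not taken from the repo):

```python
# Hypothetical sketch of merging per-instance part masks into one global map.
# MHP-v1 annotates each person in an image separately; a global segmentation
# map assigns every pixel the part label of some person covering it.

def merge_instance_masks(instance_masks):
    """Fuse per-person part-label masks (2D lists of ints, 0 = background)
    into a single global segmentation map. Later instances overwrite
    earlier ones where they overlap."""
    if not instance_masks:
        return []
    h, w = len(instance_masks[0]), len(instance_masks[0][0])
    global_map = [[0] * w for _ in range(h)]
    for mask in instance_masks:
        for y in range(h):
            for x in range(w):
                if mask[y][x] != 0:  # non-background pixel wins
                    global_map[y][x] = mask[y][x]
    return global_map

# Two 2x3 instance masks: person 1 carries label 1, person 2 carries label 2.
person1 = [[1, 1, 0], [0, 0, 0]]
person2 = [[0, 2, 2], [0, 0, 2]]
print(merge_instance_masks([person1, person2]))
# -> [[1, 2, 2], [0, 0, 2]]
```

In practice the repo's script would read the annotation images with an image library and write the fused maps back to disk; the overlap policy (which person wins at a shared pixel) is an assumption here.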
Models for deployment are here
Pre-trained models are here
This repo builds on the following third-party projects:
Deeplab v2 in PyTorch: https://github.com/speedinghzl/Pytorch-Deeplab
fastSceneUnderstanding: https://github.com/DavyNeven/fastSceneUnderstanding
Dense CRF: https://github.com/lucasb-eyer/pydensecrf
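These dependencies can be fetched roughly as follows (the clone locations and the pip-installability of pydensecrf are assumptions about your environment, not instructions from this repo):

```shell
# Fetch the third-party code listed above (target paths are illustrative).
git clone https://github.com/speedinghzl/Pytorch-Deeplab        # DeepLab v2 in PyTorch
git clone https://github.com/DavyNeven/fastSceneUnderstanding   # fast scene understanding
pip install pydensecrf                                          # dense CRF post-processing
```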