This repository contains the source code for the ECCV 2020 paper Modeling 3D Shapes by Reinforcement Learning, where we made an initial attempt to model 3D shapes like human modelers using deep reinforcement learning (DRL).
```
cd demo
python prim_agent_demo.py
python mesh_agent_demo.py
```
This demo takes an RGB image as the input reference. The Prim-Agent first generates a primitive-based representation and saves some intermediate results in the folder `prim_result/`. The Mesh-Agent then loads the primitives saved in `prim_result/` and edits their meshes; some intermediate results will be saved in `mesh_result/`.
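If you want to inspect the saved meshes programmatically, below is a minimal sketch. It assumes the results are written as OBJ files, and the file name `mesh_result/result.obj` is hypothetical; check the result folders for the actual output names.

```python
# Minimal sketch: inspect a mesh saved by the demo. Assumes OBJ output;
# the file name below is hypothetical -- check mesh_result/ for real names.
def load_obj(path):
    vertices, faces = [], []
    with open(path) as f:
        for line in f:
            parts = line.split()
            if not parts:
                continue
            if parts[0] == 'v':
                vertices.append([float(x) for x in parts[1:4]])
            elif parts[0] == 'f':
                # keep only the vertex index from each "v/vt/vn" token
                faces.append([int(tok.split('/')[0]) for tok in parts[1:]])
    return vertices, faces

vertices, faces = load_obj('mesh_result/result.obj')  # hypothetical file name
print(len(vertices), 'vertices,', len(faces), 'faces')
```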
You need to install PyTorch, NumPy and SciPy. The code is tested with Python 3.7.4, PyTorch 1.3.0, NumPy 1.17.2 and SciPy 1.3.1 on Ubuntu 18.04.4. You may optionally install TensorboardX to visualize the training. The repository includes part of the code from binvox.
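If you install TensorboardX, the snippet below is a minimal logging sketch; the log directory `runs/demo` and the tag `train/reward` are illustrative names, not ones used by this repository.

```python
from tensorboardX import SummaryWriter

writer = SummaryWriter('runs/demo')  # hypothetical log directory
for step in range(100):
    reward = step / 100.0            # placeholder standing in for a training signal
    writer.add_scalar('train/reward', reward, step)
writer.close()
```

You can then run `tensorboard --logdir runs` to view the curves in a browser.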
- Train the Prim-Agent first:
  ```
  cd Prim-Agent
  python train.py
  ```
- Then use the trained Prim-Agent to generate primitives and edge-loop files for all the data:
  ```
  python generate_edgeloop.py
  ```
- Train the Mesh-Agent using the output of the Prim-Agent (a driver sketch combining these steps follows this list):
  ```
  cd Mesh-Agent
  python train.py
  ```
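As referenced in the last item above, the three stages can be chained in a small driver script. This is only a sketch built from the commands listed here; it assumes each script can run with its default arguments, whereas in practice you must supply the paths described in the notes below.

```python
import subprocess

# Stage 1: train the Prim-Agent.
subprocess.run(['python', 'train.py'], cwd='Prim-Agent', check=True)
# Stage 2: generate primitives and edge-loop files with the trained Prim-Agent.
subprocess.run(['python', 'generate_edgeloop.py'], cwd='Prim-Agent', check=True)
# Stage 3: train the Mesh-Agent on the Prim-Agent's output.
subprocess.run(['python', 'train.py'], cwd='Mesh-Agent', check=True)
```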
- You will need to provide the paths to the training data, as well as the paths for saving the results and logs, when calling the training scripts.
- You can change the settings by modifying the parameters in `Prim-Agent/config.py` or `Mesh-Agent/config.py`.
- Call `Prim-Agent/test.py` and `Mesh-Agent/test.py` for testing. You will need to provide the paths to the data and the pre-trained models when calling them (a hypothetical invocation is sketched after this list).
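As mentioned in the last item above, a test call might look like the following. The flag names `--data-path` and `--model-path` are hypothetical, not arguments defined by this repository; check `config.py` in each agent's folder for the actual argument names.

```python
import subprocess

# Hypothetical flags -- replace them with the arguments defined in config.py.
subprocess.run(['python', 'test.py',
                '--data-path', 'data/',
                '--model-path', 'pretrained/'],
               cwd='Prim-Agent', check=True)
```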
- Data: `data.zip`
- Pre-trained models: `pretrained.zip`
- Unzip the downloaded files to replace the `data` and `pretrained` folders; then you can run the code directly without modifying the arguments when calling `train.py` and `test.py` (see the extraction sketch after this list).
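As noted above, the archives should be extracted so that they replace the existing folders. Below is a small sketch using Python's standard `zipfile` module, run from the repository root; it assumes the archives contain top-level `data/` and `pretrained/` directories.

```python
import zipfile

# Extract both downloads in place, overwriting files in data/ and pretrained/.
for archive in ('data.zip', 'pretrained.zip'):
    with zipfile.ZipFile(archive) as zf:
        zf.extractall('.')
```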
If you find our work useful in your research, please consider citing:
```
@inproceedings{lin2020modeling,
  title={Modeling 3d shapes by reinforcement learning},
  author={Lin, Cheng and Fan, Tingxiang and Wang, Wenping and Nie{\ss}ner, Matthias},
  booktitle={European Conference on Computer Vision},
  pages={545--561},
  year={2020},
  organization={Springer}
}
```
If you have any questions, please email Cheng Lin at chlin@hku.hk.