
PrimFit

PrimFit is the implementation of the reconstruction method described in our ICCV 2023 paper "Structure-Aware Surface Reconstruction via Primitive Assembly".

I will release the code before July 2024. I apologize for the delay.

In this paper, we extend the hypothesis-and-selection idea of the plane-based surface reconstruction method PolyFit to multiple primitive types, and introduce an effective pruning mechanism that significantly speeds up the extraction (selection) step.
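The hypothesis-and-selection idea can be illustrated with a small toy example (a conceptual sketch only; all candidate names, scores, and weights below are hypothetical and not taken from the paper's code): each candidate patch hypothesized from a fitted primitive gets a data-fitting score and a complexity penalty, some candidate pairs are mutually exclusive, and selection picks the valid subset with the best objective. A brute-force search stands in here for the paper's optimization and pruning.

```python
from itertools import combinations

# Hypothetical candidate patches: name -> (fitting score, complexity penalty).
candidates = {
    "plane_a":    (0.9, 0.2),
    "plane_b":    (0.8, 0.2),
    "cylinder_c": (0.7, 0.5),
    "sphere_d":   (0.4, 0.6),
}

# Mutually exclusive pairs: these two patches cannot both appear in a
# consistent surface (a stand-in for the method's geometric constraints).
conflicts = {("plane_b", "cylinder_c")}

def objective(subset, lam=1.0):
    """Data fidelity minus weighted model complexity."""
    fit = sum(candidates[c][0] for c in subset)
    complexity = sum(candidates[c][1] for c in subset)
    return fit - lam * complexity

def valid(subset):
    """No two mutually exclusive candidates are selected together."""
    return not any(a in subset and b in subset for a, b in conflicts)

# Exhaustive selection over all candidate subsets (exponential; real
# methods solve this with integer programming plus pruning instead).
names = list(candidates)
best = max(
    (set(s) for r in range(len(names) + 1) for s in combinations(names, r)
     if valid(set(s))),
    key=objective,
)
print(sorted(best))  # -> ['plane_a', 'plane_b']
```

The brute-force loop visits every subset, which is exactly what makes the selection step expensive as the number of candidates grows; the pruning mechanism described above aims to shrink that search space.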

Code

Currently, I am refactoring the code, and it will be available soon. I plan to upload all of the code during the winter vacation.

Citation

If you make use of our work, please cite our paper:

@inproceedings{Jiang2023primfit,
  title={Structure-Aware Surface Reconstruction via Primitive Assembly},
  author={Jiang, Jingen and Zhao, Mingyang and Xin, Shiqing and Yang, Yanchao and Wang, Hanxiao and Jia, Xiaohong and Yan, Dong-Ming},
  booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision},
  year={2023}
}

Acknowledgements

This work was partially funded by the National Key Research and Development Program (2021YFB1715900), the CAS Project for Young Scientists in Basic Research (YSBR-034), the National Natural Science Foundation of China (62172415, 62272277, 12022117), and the HKU-100 Research Award.

Our code is inspired by the works of BSH, PolyFit, and KSR. We would like to thank Dr. Xingyi Du and Prof. Liangliang Nan for their excellent code.

Furthermore, we are grateful to Jiahui Lv from Shenzhen University for his valuable advice on this work.

Maintenance

If you have any problems, please contact me at xiaowuga@gmail.com.