
RepSurf - Surface Representation for Point Clouds
[CVPR 2022 Oral]

By Haoxi Ran*, Jun Liu, Chengjie Wang (*: corresponding contact)


The official PyTorch implementation of "Surface Representation for Point Clouds". Repos for RepSurf on other tasks are listed below.

arXiv | PDF

Other Tasks:

3D Segmentation (ongoing): RepSurf for Segmentation

Preparation

Environment

We tested under the following environment:

  • python 3.7
  • pytorch 1.6.0
  • cuda 10.1
  • gcc 7.2.0
  • h5py

For Anaconda users, initialize the environment by:

sh init.sh

Alternatively, you can manually install the above packages and compile the CUDA-based point operators by:

cd modules/pointops
python3 setup.py install
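
After compiling, a quick way to confirm the build succeeded is to import the extension from Python. A minimal sanity check is sketched below; the module name pointops_cuda is an assumption based on common pointops builds, so check the name registered in modules/pointops/setup.py if the import fails.

import torch
import pointops_cuda  # assumed module name; raises ImportError if the build/install failed

# Confirm PyTorch sees a CUDA device (the operators are CUDA-based).
print("CUDA available:", torch.cuda.is_available())
print("pointops extension loaded:", pointops_cuda.__name__)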

Classification

ScanObjectNN

  • Performance:

Model                                  Accuracy (%)  #Params  Augment         Code  Log           Checkpoint
MVTN                                   82.8          4.24M    None            link  N/A           link
PointMLP                               85.7          12.6M    Scale, Shift    link  link          link
PointNet++ SSG                         77.9          1.475M   Rotate, Jitter  link  N/A           N/A
Umbrella RepSurf (PointNet++ SSG)      84.87         1.483M   None            link  google drive  google drive (6MB)
Umbrella RepSurf (PointNet++ SSG, 2x)  86.05         6.806M   None            link  google drive  google drive (27MB)
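
The #Params column can be reproduced for any of the PyTorch models with a short helper; in the sketch below, model is a placeholder for whichever network you instantiate.

import torch.nn as nn

def count_params(model: nn.Module) -> float:
    # Number of trainable parameters, in millions (same unit as the #Params column).
    return sum(p.numel() for p in model.parameters() if p.requires_grad) / 1e6

# Usage (hypothetical): print(f"{count_params(model):.3f}M")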

  • To download the dataset:
wget http://download.cs.stanford.edu/orion/scanobjectnn/h5_files.zip
unzip h5_files.zip
ln -s [PATH]/h5_files data/ScanObjectNN

Note: We conduct all experiments on the hardest variant of ScanObjectNN (PB_T50_RS).
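
Since h5py is already part of the environment, you can sanity-check the download by peeking into one of the h5 files. The path below is an assumed example of the PB_T50_RS file layout; list the contents of data/ScanObjectNN for the exact names.

import h5py

# Inspect one ScanObjectNN split (sketch; path and key names are assumptions).
with h5py.File('data/ScanObjectNN/main_split/training_objectdataset_augmentedrot_scale75.h5', 'r') as f:
    print(list(f.keys()))    # typically ['data', 'label', 'mask']
    print(f['data'].shape)   # (num_objects, num_points, 3) xyz coordinates
    print(f['label'].shape)  # (num_objects,) class indices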

  • To train Umbrella RepSurf on ScanObjectNN:
sh scripts/repsurf/scanobjectnn/repsurf_ssg_umb.sh
  • To train Umbrella RepSurf (2x setting) on ScanObjectNN:
sh scripts/repsurf/scanobjectnn/repsurf_ssg_umb_2x.sh

Visualization

We provide several visualization results in the folder ./visualization for a closer look at the construction of RepSurf.

TODO

  • Classification on ModelNet40
  • Segmentation on S3DIS / ScanNet

Acknowledgment

We use part of the library pointops from PointWeb.

License

RepSurf is under the Apache-2.0 license. Please contact the primary author Haoxi Ran (ranhaoxi@gmail.com) for commercial use.