- Install Conda and create a conda environment.
conda create -n ounet python=3.10
conda activate ounet
- Install PyTorch 2.1.0 with conda according to the official documentation.
conda install pytorch torchvision torchaudio pytorch-cuda=12.1 -c pytorch -c nvidia
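Optionally, verify that the install can see the GPU before moving on:
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"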
- Install the requirements.
pip install -r requirements.txt
# For evaluation only
conda install -c conda-forge point_cloud_utils==0.18.0
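point_cloud_utils is only used by the evaluation script. As a rough sketch of what it provides, the snippet below computes CD and HD for two point clouds (the random arrays are placeholders, not code from this repo):
import numpy as np
import point_cloud_utils as pcu

# Placeholder clouds; in practice these are the predicted and ground-truth points.
pred = np.random.rand(1000, 3)
gt = np.random.rand(1000, 3)

print(pcu.chamfer_distance(pred, gt))    # CD
print(pcu.hausdorff_distance(pred, gt))  # HD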
The public datasets can be downloaded from their official pages: PU-GAN, Sketchfab, PU1K, and PUNet. Place and unzip them into the folder original_dataset. Then run the following command to prepare the dataset.
bash tools/prepare_dataset.sh
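If the prepared data is stored as plain-text .xyz files (an assumption; the exact layout depends on tools/prepare_dataset.sh), a quick sanity check of one sample looks like:
import numpy as np

# Hypothetical path; replace with a file actually produced by prepare_dataset.sh.
pts = np.loadtxt("original_dataset/PU-GAN/sample.xyz")
print(pts.shape)  # expect (N, 3): one x, y, z row per point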
We trained our network on the aforementioned four datasets. Please download the trained weights via Google Drive or Baidu Netdisk and place them in the folder logs/puc/checkpoints.
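For example, assuming the downloaded file is named ckpt-best.pth (the actual filename may differ):
mkdir -p logs/puc/checkpoints
mv ckpt-best.pth logs/puc/checkpoints/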
Run the following command to train the network with 4 GPUs. The log and trained model will be saved in the folder logs/upsample-clean.
python main.py --config=configs/upsample-clean.yaml
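To pin training to specific GPUs, the standard CUDA_VISIBLE_DEVICES variable can be set (the device indices below are an example):
CUDA_VISIBLE_DEVICES=0,1,2,3 python main.py --config=configs/upsample-clean.yaml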
Run the following command to generate the upsampled and cleaned point clouds, which will be saved in the folder logs/puc/model_outputs.
python main.py --config=configs/upsample-clean.yaml SOLVER.run evaluate
Run the following command to evaluate the upsampling results using CD, HD, and P2F, where <dataset> is one of PU-GAN, Sketchfab, and PU1K.
python evaluate.py --outputdir=logs/upsample-clean/model_outputs/upsampling/<dataset> --dataset=<dataset>
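To sweep all three datasets with one command, a simple shell loop over the call above works (adjust the names if the folders under model_outputs/upsampling are spelled differently):
for dataset in PU-GAN Sketchfab PU1K; do
    python evaluate.py --outputdir=logs/upsample-clean/model_outputs/upsampling/${dataset} --dataset=${dataset}
done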
Run the following command to evaluate the cleaning results using CD, HD, and P2F, where <resolution> is 10k or 50k and <noise level> is 1, 2, or 25.
python evaluate.py --outputdir=logs/upsample-clean/model_outputs/cleaning/<resolution>/noise_<noise level> --dataset=PUNet_<resolution>
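All cleaning settings can likewise be evaluated in one sweep (resolution and noise values copied from the list above; adjust if the on-disk folder names differ):
for resolution in 10k 50k; do
    for noise in 1 2 25; do
        python evaluate.py --outputdir=logs/upsample-clean/model_outputs/cleaning/${resolution}/noise_${noise} --dataset=PUNet_${resolution}
    done
done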