Totoro97/f2-nerf

Export point cloud in original space

Bin-ze opened this issue · 7 comments

Bin-ze commented

I tried to use f2-nerf to export a point cloud. The method is simple: just map each depth value into 3D space along its ray:

Tensor point = rays_o.to(torch::kCPU) + rays_d.to(torch::kCPU) * pred_depth;

However, I found that the exported point cloud has some distortion, which is expected if the samples are trained in warping space. How do I map the point cloud back to Euclidean space?
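f2-nerf's own perspective warp is data-dependent and has no simple closed-form inverse to quote here, but as an illustration of the general idea of undoing a warp, here is a sketch that inverts a mip-NeRF-360-style scene contraction. This is an assumption for illustration only, not f2-nerf's actual warp: the contraction is the identity inside the unit ball and maps the outside into the shell 1 < ||y|| < 2, and it can be inverted analytically.

```cpp
#include <array>
#include <cassert>
#include <cmath>

using Vec3 = std::array<double, 3>;

static double norm3(const Vec3& v) {
    return std::sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
}

// mip-NeRF-360-style contraction (an assumption, not f2-nerf's warp):
// identity inside the unit ball, squashes the outside into 1 < ||y|| < 2.
Vec3 contract(const Vec3& x) {
    double n = norm3(x);
    if (n <= 1.0) return x;
    double s = (2.0 - 1.0 / n) / n;  // ||result|| == 2 - 1/n
    return {x[0] * s, x[1] * s, x[2] * s};
}

// Inverse contraction: map a warp-space point back to Euclidean space.
// Given m = ||y||, the original norm is n = 1/(2 - m), so scale by n/m.
Vec3 uncontract(const Vec3& y) {
    double m = norm3(y);
    if (m <= 1.0) return y;
    double s = 1.0 / ((2.0 - m) * m);
    return {y[0] * s, y[1] * s, y[2] * s};
}
```

Applying `uncontract` to every exported point (after computing `rays_o + rays_d * pred_depth` in warp space) would recover Euclidean coordinates under this assumed contraction; for f2-nerf itself, the actual inverse of its perspective warping would have to be used instead.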

Bin-ze commented

I have another question: what is the range of the predicted point cloud? Is it inside a [-1, 1] box?

So did you manage to predict depth, or do you mean `pred_disps`?

Also, could you please elaborate on what other changes we need to make to get the point cloud, and which files we need to check?

Bin-ze commented

Yes, f2-nerf can predict depth, so it can export a point cloud, but the depth is in warping space. You can check this:
https://github.com/nerfstudio-project/nerfstudio/blob/main/nerfstudio/exporter/exporter_utils.py
Another point to consider is how to filter out points that are far away, since they affect the quality of the point cloud.
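One simple way to do the far-point filtering mentioned above is a percentile cutoff on distance from the scene center. This is a sketch, not f2-nerf's or nerfstudio's actual exporter logic; the `keep` fraction of 0.95 is an assumed default to tune per scene.

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <vector>

struct Point { double x, y, z; };

// Drop points whose distance from the origin (assumed scene center)
// exceeds the given percentile of all point distances.
std::vector<Point> filterFarPoints(const std::vector<Point>& pts,
                                   double keep = 0.95) {
    if (pts.empty()) return pts;
    std::vector<double> dist(pts.size());
    for (size_t i = 0; i < pts.size(); ++i)
        dist[i] = std::sqrt(pts[i].x * pts[i].x +
                            pts[i].y * pts[i].y +
                            pts[i].z * pts[i].z);
    // Find the distance value at the `keep` percentile.
    std::vector<double> sorted = dist;
    size_t k = static_cast<size_t>(keep * (sorted.size() - 1));
    std::nth_element(sorted.begin(), sorted.begin() + k, sorted.end());
    double cutoff = sorted[k];
    // Keep only points at or below the cutoff distance.
    std::vector<Point> out;
    for (size_t i = 0; i < pts.size(); ++i)
        if (dist[i] <= cutoff) out.push_back(pts[i]);
    return out;
}
```

A density- or neighbor-count-based outlier filter (as in typical point cloud libraries) would be more robust, but a distance percentile is often enough to remove the stray background points that dominate the export.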

Bin-ze commented

@Totoro97 could you tell me how to export the point cloud?

@Bin-ze I'm having exactly the same issue: the generated disparity has distortion problems. Have you managed to solve that?


Hi, have you found a method for exporting the point cloud? How do you do it, specifically?