Fictionarry/DNGaussian

Custom Dataset

adksantosh opened this issue · 3 comments

How do I run this on my own scenes? I tried running `train_blender` for 6000 iterations on a COLMAP dataset (prepared with `convert.py`) that includes a `depth_maps` folder, but it produces just a random black blob. Am I missing something? Also, how would I run this without COLMAP?

Hi, what about running `python train_llff.py -s $dataset --model_path $workspace --rand_pcd --eval --iterations 6000 --lambda_dssim 0.2`? This drops the `--n_sparse` option, which helps check whether your dataset is correct. If everything is OK, correct rendered views will appear in the `$workspace/eval/renders` folder. Currently, COLMAP is necessary to estimate the camera pose of each view.
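For reference, a minimal sketch of that sanity-check run (the dataset and workspace paths here are placeholders, not paths from the repo):

```shell
# Hypothetical paths -- substitute your own COLMAP-processed scene
# (e.g. one prepared with convert.py) and an output directory.
DATASET=/path/to/my_scene
WORKSPACE=output/my_scene_check

# Train without the --n_sparse option to verify the dataset loads correctly.
python train_llff.py -s "$DATASET" --model_path "$WORKSPACE" \
    --rand_pcd --eval --iterations 6000 --lambda_dssim 0.2

# If the dataset is correct, rendered views should appear under:
#   $WORKSPACE/eval/renders
```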

It trains and renders fine, but the `.ply` file has a different structure than the original. Is there code for the training loop with spherical harmonics?

You can try replacing `render`, `render_for_depth`, and `render_for_opa` with `render_sh`, `render_for_depth_sh`, and `render_for_opa_sh` at line 21, replacing `GaussianModel` with `GaussianModelSH` at line 23, and changing line 190 to `scene.save(iteration)` in `train_llff.py`.
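Sketched as a diff against `train_llff.py` (the removed lines below are my guesses at the originals, not confirmed by this thread, and the line numbers may shift between versions; the original content of line 190 is not stated, so only the replacement is shown):

```diff
 # line 21: swap the rasterizer entry points for their SH variants
-from gaussian_renderer import render, render_for_depth, render_for_opa
+from gaussian_renderer import render_sh, render_for_depth_sh, render_for_opa_sh

 # line 23: use the SH-enabled Gaussian model
-from scene import Scene, GaussianModel
+from scene import Scene, GaussianModelSH

 # line 190: save the scene so the output .ply keeps the SH structure
+        scene.save(iteration)
```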

However, I'm not sure whether that will introduce any bugs. Also, SH can cause some bright spots in the renderings, as shown in our supplementary material.