Wanggcong/SparseNeRF

Not able to run custom data

sairisheek opened this issue · 6 comments

Hello, thanks for the great work. I am trying to run mip-NeRF scenes with ablated views, but for some reason the training renderings look something like this:
[screenshot of training renderings]
Any idea why this would be the case? PSNR is high during training but extremely low at test time, even though the test set and the train set are the same images.

Thanks for your question. NeRF can interpolate between views but struggles to extrapolate beyond them. For example, if you use views 1, 3, and 5 for training and views 2 and 4 for testing, the test views are interpolated. If you instead use views 2, 3, and 4 for training and views 1 and 5 for testing, the test views must be extrapolated, which can lead to bad results.

Moreover, these training views should be overlapped to some extent.
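The view-selection advice above can be sketched as follows. This is a hypothetical helper, not part of the SparseNeRF codebase; it assumes view indices follow the capture order, so every held-out view lies between two training views and is interpolated rather than extrapolated.

```python
# Hypothetical sketch: pick every k-th view for training so that the
# remaining (test) views fall between training views.
def split_views(num_views, train_stride=2):
    """Return (train_ids, test_ids); test views are interpolated."""
    train_ids = list(range(0, num_views, train_stride))
    test_ids = [i for i in range(num_views) if i not in train_ids]
    return train_ids, test_ids

# e.g. 5 views: train on 0, 2, 4; test on the in-between views 1, 3
train_ids, test_ids = split_views(5)
```

With this split the first and last views are always in the training set, so no test pose lies outside the covered range, and adjacent training views still overlap as required.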

I believe the testing and training views are the same. I think it's caused by the random-rays flag being set to false during evaluation.

Can you show the rendered video?

Yes, I'll try to get that for you. Meanwhile I had fixed some bugs with our camera poses and am getting results like this for a 360 scene.
[screenshot of 360-degree scene renderings]
I had set remap_to_hemisphere to true so as to not use NDC. Is there something I'm missing?

Yes, 360-degree scenes are different from NDC (forward-facing) scenes, and you need to handle that case carefully. Do you use our code directly for 360-degree scenes?
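For context, the NDC warp used for forward-facing scenes assumes all rays point toward the scene along -z; in a 360-degree capture many rays have near-zero or positive z components, so the warp degenerates. Below is a sketch of the standard NDC ray conversion from the original NeRF reference code (names follow that code, not necessarily SparseNeRF's implementation):

```python
import numpy as np

def ndc_rays(H, W, focal, near, rays_o, rays_d):
    """Standard NeRF NDC warp (forward-facing scenes only).

    Divides by rays_d[..., 2] and rays_o[..., 2], so it is only valid
    when every ray points into the scene along -z -- which does not
    hold for 360-degree captures.
    """
    # Shift ray origins onto the near plane z = -near
    t = -(near + rays_o[..., 2]) / rays_d[..., 2]
    rays_o = rays_o + t[..., None] * rays_d

    # Projected origin in NDC space
    o0 = -1.0 / (W / (2.0 * focal)) * rays_o[..., 0] / rays_o[..., 2]
    o1 = -1.0 / (H / (2.0 * focal)) * rays_o[..., 1] / rays_o[..., 2]
    o2 = 1.0 + 2.0 * near / rays_o[..., 2]

    # Projected direction in NDC space
    d0 = -1.0 / (W / (2.0 * focal)) * (
        rays_d[..., 0] / rays_d[..., 2] - rays_o[..., 0] / rays_o[..., 2])
    d1 = -1.0 / (H / (2.0 * focal)) * (
        rays_d[..., 1] / rays_d[..., 2] - rays_o[..., 1] / rays_o[..., 2])
    d2 = -2.0 * near / rays_o[..., 2]

    return np.stack([o0, o1, o2], -1), np.stack([d0, d1, d2], -1)
```

For 360-degree scenes this warp must be skipped entirely (as the remap_to_hemisphere flag does here), and the scene bounds parameterized some other way, e.g. the contracted coordinates used by mip-NeRF 360.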

I closed this issue since there are no further questions. You might re-open it or raise a new issue if necessary.