hjxwhy/mipnerf_pl

Collaboration

theFilipko opened this issue · 8 comments

Hello, thanks for sharing this implementation, it is awesome.

I am also implementing this and "mip-nerf 360" in my codebase. Would you like to work on it together?

I had a look at your latest commit and actually this

# Sample linearly in disparity (1/depth) between near and far.
far_inv = 1 / far
near_inv = 1 / near
t_samples = far_inv * t_samples + (1 - t_samples) * near_inv
t_samples = 1 / t_samples

is equivalent to this one-liner:
t_samples = 1. / (1. / near * (1. - t_samples) + 1. / far * t_samples)
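
For anyone reading along, here is a quick numerical sanity check of that equivalence (a minimal sketch; the near/far bounds and sample count are arbitrary illustrative values, not taken from the repo):

import torch

near, far = 2.0, 6.0                      # arbitrary illustrative bounds
t_samples = torch.linspace(0.0, 1.0, 65)  # uniform samples in [0, 1]

# Two-step version from the commit.
far_inv = 1 / far
near_inv = 1 / near
repo_version = 1 / (far_inv * t_samples + (1 - t_samples) * near_inv)

# Equivalent one-liner: linear interpolation in disparity (1/depth).
one_liner = 1. / (1. / near * (1. - t_samples) + 1. / far * t_samples)

assert torch.allclose(repo_version, one_liner)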

@theFilipko Hey, thanks for your advice! It's my pleasure. Have you finished mip-nerf 360?

I am in the middle of it. Please contact me here: filip.hendrichovsky@gmail.com

Hello, have you finished mip-nerf 360? I look forward to your implementation.

@HangXiao-97 if you would like to participate in the implementation, contact me at that email :)

Hey! I am wondering if the mip-nerf 360 implementation is complete?

If not, and if functionality and results have changed, can you point to a functioning mip-NeRF from a previous commit?

Thanks!

Hey, thanks for reaching out. There has been a code release: https://github.com/google-research/multinerf. Check this out.

Hey, I have checked that out, but unfortunately it's in JAX, and flowing gradients between our PyTorch framework and JAX adds extra overhead and causes a lot of issues, which is why your framework is very useful. In either case, it seems that your implementation works for mip-NeRF, right? I can implement mip-NeRF 360 on top of it if the base works.

@ktiwary2 Hi, thanks for your attention. I still have not finished mipnerf360 because I am debugging Block-NeRF, so I have no time to port the JAX code to PyTorch. You can implement mipnerf360 based on this code.