YanjieZe/GNFactor

Loss_balance


Hi,

I noticed that in this code you set the weights to 1 and 0.01. Does this mean that the rendering loss plays a very small role?

total_loss = lambda_BC * total_loss + lambda_nerf * rendering_loss_dict['loss']  # lambda_BC = 1, lambda_nerf = 0.01

In fact, in my task the total_loss is about 20, while the rendering_loss is much smaller, around 1.12.

The two losses already differ by about an order of magnitude; after multiplying by 0.01, the gap grows to roughly three orders of magnitude. Is that intended?

Alternatively, could you tell me what the optimal ratio is? That would save me some trial-and-error time. Thanks very much.
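
For reference, a quick check of the scales with the rough numbers from my run (the 20 and 1.12 are just my observed values):

lambda_BC, lambda_nerf = 1.0, 0.01
bc_loss, rendering_loss = 20.0, 1.12          # approximate values from my run

weighted_bc = lambda_BC * bc_loss             # 20.0
weighted_nerf = lambda_nerf * rendering_loss  # 0.0112
print(weighted_bc / weighted_nerf)            # ~1786, i.e. roughly three orders of magnitude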

Hi,

We tried 1, 0.1, and 0.01 in previous experiments and found that 0.01 is slightly better, though not significantly. So the "optimal ratio" from these experiments is 0.01.

If you have enough compute, you are encouraged to try more weights besides these.
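
As a minimal sketch of such a sweep (assuming bc_loss and rendering_loss are scalar tensors from your training step; combine_losses is just an illustrative helper, not a function in the repo):

import torch

def combine_losses(bc_loss, rendering_loss, lambda_BC=1.0, lambda_nerf=0.01):
    # Weighted sum used for the total objective; defaults match the released weights.
    return lambda_BC * bc_loss + lambda_nerf * rendering_loss

# Sweep over the weights tried in the experiments above:
for lambda_nerf in (1.0, 0.1, 0.01):
    total = combine_losses(torch.tensor(20.0), torch.tensor(1.12),
                           lambda_nerf=lambda_nerf)
    print(f"lambda_nerf={lambda_nerf}: total_loss={total.item():.4f}")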

OK, thank you very much for your reply!

Hi,

Sorry to bother you again. I'd like to know how you determined the transformation matrix _coord_trans. I can't reproduce the reconstructed images (based on Stable Diffusion features and RGB) under the new robot benchmark.

self._coord_trans = torch.diag(torch.tensor([1, -1, -1, 1], dtype=torch.float32)).to(device)  # flips the Y and Z axes of the pose

gt_pose = nerf_target_pose @ self._coord_trans # remember to do this

Hi, this transformation is there to align the coordinate systems. In our implementation, I believe the cameras and the rendering code follow the OpenGL/OpenCV conventions respectively, so we need to convert between them. In your case, make sure your coordinate systems are aligned; you could also try removing this transformation to see if it works better.
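
For example, here is a small standalone check of what the transform does (nerf_target_pose below is just a placeholder identity pose, not the real one from the dataloader):

import torch

device = "cpu"
# diag(1, -1, -1, 1): right-multiplying a 4x4 camera-to-world pose by this
# negates its Y and Z columns, converting between the OpenGL and OpenCV
# camera conventions.
coord_trans = torch.diag(torch.tensor([1, -1, -1, 1], dtype=torch.float32)).to(device)

nerf_target_pose = torch.eye(4)          # placeholder camera-to-world pose
gt_pose = nerf_target_pose @ coord_trans

# The matrix is its own inverse, so applying it twice recovers the input:
assert torch.allclose(gt_pose @ coord_trans, nerf_target_pose)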

Thank you so much! I'll try it. o( ̄▽ ̄)ブ