Could you provide the pretrained depth net on Matterport3D?
cwchenwang opened this issue · 2 comments
Here is a model trained on Matterport3D.
validation houses: yqstnuAEVhm, Z6MFQCViBuw, ZMojNkEp431, zsNo4HB9uLZ, XcA2TqTSSAj
test houses: VzqfbhrpDEA, Vvot9Ly1tCj, YFuZgdQ5vWj, YVUC4YcDtcY, YmJkqBEsHnH
training houses: all remaining houses
The network inputs, RGB and sparse depth, are normalized in the same way as for ScanNet:
https://github.com/barbararoessle/dense_depth_priors_nerf/blob/master/run_nerf.py#L683
https://github.com/barbararoessle/dense_depth_priors_nerf/blob/master/run_nerf.py#L670
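The input normalization can be sketched roughly as follows. This is a minimal sketch, not the repository's actual code: the constant values, the function name `normalize_input`, and the per-channel standardization scheme are all assumptions here; the real constants and logic live at the linked lines of `run_nerf.py`.

```python
import numpy as np

# Placeholder normalization statistics -- the actual values are defined in
# the repository (see the linked lines in run_nerf.py).
RGB_MEAN = np.array([0.485, 0.456, 0.406])  # assumed ImageNet-style stats
RGB_STD = np.array([0.229, 0.224, 0.225])
DEPTH_MEAN = 2.0  # meters, placeholder
DEPTH_STD = 1.0   # meters, placeholder

def normalize_input(rgb, sparse_depth):
    """Standardize RGB per channel and standardize valid sparse depth values.

    rgb: HxWx3 array in [0, 255]; sparse_depth: HxW array in meters,
    with 0 marking pixels that have no depth measurement.
    Invalid (zero) depth pixels are left at zero.
    """
    rgb_n = (rgb / 255.0 - RGB_MEAN) / RGB_STD
    depth_n = np.where(sparse_depth > 0.0,
                       (sparse_depth - DEPTH_MEAN) / DEPTH_STD,
                       0.0)
    return rgb_n, depth_n
```

The key point is that the depth prior network expects its inputs in this standardized space, so any new data (e.g. from Matterport3D) must go through the same transform before inference.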
And the network outputs, depth and uncertainty, are converted back to meters like this:
https://github.com/barbararoessle/dense_depth_priors_nerf/blob/master/run_nerf.py#L686
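The inverse transform can be sketched as below. Again a sketch under assumptions: the constants and the function name `denormalize_output` are placeholders, and the assumption that the uncertainty is a std-like quantity (so it is scaled but not shifted) should be checked against the linked line of `run_nerf.py`.

```python
# Placeholder statistics; must match whatever was used to normalize the
# training depth (see the linked line in run_nerf.py for the real values).
DEPTH_MEAN = 2.0  # meters, placeholder
DEPTH_STD = 1.0   # meters, placeholder

def denormalize_output(depth_pred, uncertainty_pred):
    """Map standardized network outputs back to meters.

    The depth gets the full affine inverse (scale and shift); the
    uncertainty, assumed to be a standard-deviation-like quantity,
    is only scaled, since a shift would not apply to a spread.
    """
    depth_m = depth_pred * DEPTH_STD + DEPTH_MEAN
    uncertainty_m = uncertainty_pred * DEPTH_STD
    return depth_m, uncertainty_m
```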
For best performance, the depth prior training should match the scenario in which you plan to use the depth priors (e.g., the sparse depth density and accuracy). Details can be found here
@barbararoessle You should provide pretrained models and the train/test split on Matterport3D so that future works can compare against them.