
StyleMesh (CVPR2022)

This is the official repository containing the source code for the CVPR2022 paper StyleMesh, which optimizes a stylized texture for an indoor scene reconstruction.

[Arxiv] [Project Page] [Video]

(Teaser figure)

If you find StyleMesh useful for your work, please cite:

@inproceedings{hollein2022stylemesh,
  title={StyleMesh: Style Transfer for Indoor 3D Scene Reconstructions},
  author={H{\"o}llein, Lukas and Johnson, Justin and Nie{\ss}ner, Matthias},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={6198--6208},
  year={2022}
}

Preprocessing

The following steps are necessary to prepare all data.

Data Setup

Project Setup

Texture Optimization

The following steps allow you to optimize a texture for a specific scene/style combination. You can select the scene/style by modifying the corresponding values in the scripts (--scene for ScanNet and additionally --region for Matterport). The scripts also allow you to fine-tune the loss weights if you want to experiment with your own settings.

All style images that are used in the main paper can be found in the styles folder.
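
For example, a small sweep over scene/style combinations could be scripted as below. This is a minimal sketch: the entry-point name train_texture.py, the --style flag, and the file paths are assumptions (only --scene and --region are documented above), so adapt them to the actual optimization scripts.

```python
# Hypothetical batch driver. "train_texture.py" and "--style" are assumed
# names; only --scene (ScanNet) and --region (Matterport) are documented.
import subprocess

scenes = ["scene0000_00"]                # example ScanNet scene id
styles = ["styles/starry_night.jpg"]     # example style image path

for scene in scenes:
    for style in styles:
        subprocess.run(
            ["python", "train_texture.py", "--scene", scene, "--style", style],
            check=True,  # stop the sweep if a run fails
        )
```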

By default, run files (texture, hparams, logging) are saved in style-mesh/lightning_logs.
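
To inspect finished runs programmatically, a minimal sketch like the following can help, assuming PyTorch Lightning's default layout (per-run version_* folders, each with an hparams.yaml) and PyYAML installed:

```python
# Minimal sketch: print the saved hyperparameters of every run, assuming
# PyTorch Lightning's default lightning_logs/version_*/hparams.yaml layout.
from pathlib import Path

import yaml

for run in sorted(Path("style-mesh/lightning_logs").glob("version_*")):
    hparams_file = run / "hparams.yaml"
    if hparams_file.exists():
        print(run.name, yaml.safe_load(hparams_file.read_text()))
```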

The suffix "with_angle_and_depth" is used for the comparisons in Figs. 4-9 and 11. The suffixes "only2D" and "with_angle" are used for the ablation study in Fig. 7. The suffix "dip" is used for the DIP baseline in Figs. 4-6.

Render Optimized Texture

You can render images with mipmapping and shading using our OpenGL renderers for each dataset. Alternatively, you can combine the texture files generated after each optimization with the generated meshes and view the textured mesh in any mesh viewer (e.g. MeshLab or Blender).
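
For a quick programmatic preview instead of a full viewer, a minimal sketch with trimesh could look like this; the file names are placeholders, and it assumes the exported mesh carries UV coordinates (pip install trimesh pillow, plus a viewer backend such as pyglet for the window):

```python
# Minimal preview sketch with trimesh; "scene.obj" and
# "optimized_texture.png" are placeholder file names.
import trimesh
from PIL import Image

mesh = trimesh.load("scene.obj", force="mesh")   # force a single Trimesh
texture = Image.open("optimized_texture.png")

# Re-attach the optimized texture through the mesh's existing UV coordinates.
mesh.visual = trimesh.visual.TextureVisuals(uv=mesh.visual.uv, image=texture)
mesh.show()  # opens an interactive viewer window
```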

Evaluate Reprojection Error

We use the file scripts/eval/eval_image_folders.py to calculate the reprojection error (Tab. 1).
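
The exact flags of that script are not documented here, but conceptually it compares two folders of corresponding images. As an illustrative stand-in (not the repository's implementation), a folder-to-folder RMSE could look like this, assuming identically named PNG files in both folders:

```python
# Illustrative sketch, not the repository's script: mean per-image RMSE
# between two folders of corresponding images (e.g. stylized renderings
# and their reprojected counterparts).
from pathlib import Path

import numpy as np
from PIL import Image

def folder_rmse(folder_a: str, folder_b: str) -> float:
    errors = []
    for path_a in sorted(Path(folder_a).glob("*.png")):
        a = np.asarray(Image.open(path_a), dtype=np.float32) / 255.0
        b = np.asarray(Image.open(Path(folder_b) / path_a.name),
                       dtype=np.float32) / 255.0
        errors.append(np.sqrt(np.mean((a - b) ** 2)))
    return float(np.mean(errors))
```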

Evaluate Circle Metric

We use the file scripts/eval/measure_circles.py to calculate the circle metric (Tab. 2, Fig. 8).
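
Roughly speaking, a circular pattern in the style should stay circular in the rendered output. As an illustrative stand-in (not the repository's implementation), one way to quantify this is to fit an ellipse to the pattern's contour with OpenCV and report the axis ratio:

```python
# Illustrative sketch, not the repository's script: fit an ellipse to the
# largest contour in a binary mask of the pattern and report how circular
# it is (1.0 = perfect circle). Requires opencv-python and numpy.
import cv2
import numpy as np

def roundness(mask: np.ndarray) -> float:
    contours, _ = cv2.findContours(
        mask.astype(np.uint8), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE
    )
    largest = max(contours, key=cv2.contourArea)  # needs >= 5 points
    (width, height) = cv2.fitEllipse(largest)[1]  # axes of the fitted ellipse
    return min(width, height) / max(width, height)
```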