PhysParamInference

A clone of the code for the WACV 2023 paper, adapted to pouring-water scenes.


Neural Implicit Representations for Physical Parameter Inference from a Single Video

Florian Hofherr¹, Lukas Koestler¹, Florian Bernard², Daniel Cremers¹

¹Technical University of Munich    ²University of Bonn

IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 2023

arXiv | Project Page

Getting Started

You can create an Anaconda environment called physParamInference with all required dependencies by running

conda env create -f environment.yml
conda activate physParamInference

You can download the data using

bash download_data.sh

The script downloads all data used in the paper and stores it in a /data/ folder.

Usage

Training

Training for a given scenario is started with python training_***.py. The parameters for each scenario are defined in the corresponding config file in the /configs/ folder.

The results, including checkpoints, as well as the logs, are stored in a subfolder of the /experiments/ folder; the path is defined in the config file. You can monitor the training progress with TensorBoard by calling tensorboard --logdir experiments/path/to/experiment.
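Since each run writes its checkpoints into its own subfolder of /experiments/, it can be handy to find the most recent checkpoint per experiment. The sketch below is a hypothetical convenience and not part of the repository; the folder layout and the checkpoint file extension are assumptions.

```python
from pathlib import Path


def latest_checkpoints(experiments_root: str, pattern: str = "*.pt"):
    """Map each experiment subfolder to its most recently modified checkpoint.

    The layout (one subfolder per experiment) and the ``.pt`` extension are
    assumptions about how the training scripts store their results.
    """
    root = Path(experiments_root)
    result = {}
    for exp_dir in sorted(p for p in root.iterdir() if p.is_dir()):
        # Sort candidate checkpoint files by modification time, newest last.
        ckpts = sorted(exp_dir.glob(pattern), key=lambda p: p.stat().st_mtime)
        if ckpts:
            result[exp_dir.name] = ckpts[-1]
    return result
```

A call like `latest_checkpoints("experiments")` would then return a dict such as `{"bouncing_ball": Path("experiments/bouncing_ball/epoch_100.pt")}`, which you could pass on to an evaluation notebook.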

Evaluation

For each of the scenarios there is an evaluate_***.ipynb notebook in the /evaluations/ folder that can be used to load and analyze the trained models.