
A flexible, high-performance 3D simulator for Embodied AI research.



Habitat-Sim

A high-performance, physics-enabled 3D simulator for loading 3D scenes, simulating agents with configurable sensors, and rigid-body physics.

The design philosophy of Habitat is to prioritize simulation speed over the breadth of simulation capabilities. When rendering a scene from the Matterport3D dataset, Habitat-Sim achieves several thousand frames per second (FPS) running single-threaded and reaches over 10,000 FPS multi-process on a single GPU. Habitat-Sim simulates a Fetch robot interacting in ReplicaCAD scenes at over 8,000 steps per second (SPS), where each ‘step’ involves rendering 1 RGBD observation (128×128 pixels) and rigid-body dynamics for 1/30sec.
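As a sanity check on those numbers, a quick back-of-the-envelope calculation (assuming each step advances simulated time by exactly 1/30 sec, as stated above) shows how far ahead of real time the simulation runs:

```python
# Rough check: how much faster than real time is 8,000 SPS?
# Assumes each step advances simulated time by exactly 1/30 sec (as above).
SIM_DT = 1.0 / 30.0        # simulated seconds advanced per step
STEPS_PER_SECOND = 8_000   # reported throughput (Fetch in ReplicaCAD)

realtime_factor = STEPS_PER_SECOND * SIM_DT  # simulated sec per wall-clock sec
print(f"~{realtime_factor:.0f}x real time")  # ~267x real time
```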

Habitat-Sim is typically used with Habitat-Lab, a modular high-level library for end-to-end experiments in embodied AI -- defining embodied AI tasks (e.g. navigation, instruction following, question answering), training agents (via imitation or reinforcement learning, or no learning at all as in classical SensePlanAct pipelines), and benchmarking their performance on the defined tasks using standard metrics.




Table of contents

  1. Citing Habitat
  2. Installation
  3. Testing
  4. Documentation
  5. Datasets
  6. Examples
  7. External Contributions
  8. License

Citing Habitat

If you use the Habitat platform in your research, please cite the Habitat and Habitat 2.0 papers:

@article{szot2021habitat,
  title     =     {Habitat 2.0: Training Home Assistants to Rearrange their Habitat},
  author    =     {Andrew Szot and Alex Clegg and Eric Undersander and Erik Wijmans and Yili Zhao and John Turner and Noah Maestre and Mustafa Mukadam and Devendra Chaplot and Oleksandr Maksymets and Aaron Gokaslan and Vladimir Vondrus and Sameer Dharur and Franziska Meier and Wojciech Galuba and Angel Chang and Zsolt Kira and Vladlen Koltun and Jitendra Malik and Manolis Savva and Dhruv Batra},
  journal   =     {arXiv preprint arXiv:2106.14405},
  year      =     {2021}
}

@inproceedings{habitat19iccv,
  title     =     {Habitat: {A} {P}latform for {E}mbodied {AI} {R}esearch},
  author    =     {Manolis Savva and Abhishek Kadian and Oleksandr Maksymets and Yili Zhao and Erik Wijmans and Bhavana Jain and Julian Straub and Jia Liu and Vladlen Koltun and Jitendra Malik and Devi Parikh and Dhruv Batra},
  booktitle =     {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
  year      =     {2019}
}

Habitat-Sim also builds on work contributed by others. If you use contributed methods/models, please cite their works. See the External Contributions section for a list of what was externally contributed and the corresponding work/citation.

Installation

Habitat-Sim can be installed in 3 ways:

  1. Via Conda - Recommended method for most users. Stable release and nightly builds.
  2. Via Docker - Updated approximately once per year for Habitat Challenge: habitat-docker-setup.
  3. Via Source - For active development. Read build instructions and common build issues.

[Recommended] Conda Packages

Habitat is under active development, and we advise users to restrict themselves to stable releases. Starting with v0.1.4, we provide conda packages for each release.

Preparing conda env

Assuming you have conda installed, let's prepare a conda env:

# We require python>=3.6 and cmake>=3.10
conda create -n habitat python=3.6 cmake=3.14.0
conda activate habitat

Installing habitat-sim

Pick one of the options below depending on your system/needs:

  • To install on machines with an attached display:

    conda install habitat-sim -c conda-forge -c aihabitat

  • To install on headless machines (i.e. without an attached display, e.g. in a cluster) and machines with multiple GPUs:

    conda install habitat-sim headless -c conda-forge -c aihabitat

  • [Most common scenario] To install habitat-sim with Bullet physics:

    conda install habitat-sim withbullet -c conda-forge -c aihabitat

  • Note: Build parameters can be chained together. For instance, to install habitat-sim with physics on headless machines:

    conda install habitat-sim withbullet headless -c conda-forge -c aihabitat

Conda packages for older versions can be installed by explicitly specifying the version, e.g. conda install habitat-sim=0.1.6 -c conda-forge -c aihabitat.

We also provide a nightly conda build for the master branch. However, this should only be used if you need a specific feature not yet in the latest release version. To get the nightly build of the latest master, simply swap -c aihabitat for -c aihabitat-nightly.

Testing

  1. Run our python data download utility to retrieve the test assets:

    python -m habitat_sim.utils.datasets_download --uids habitat_test_scenes --data-path /path/to/data/
  2. Interactive testing: Use the interactive viewer included with Habitat-Sim

    # ./build/viewer if compiling locally
    habitat-viewer /path/to/data/scene_datasets/habitat-test-scenes/skokloster-castle.glb

    You should be able to control an agent in this test scene. Use W/A/S/D keys to move forward/left/backward/right and arrow keys or mouse to control gaze direction (look up/down/left/right). Try to find the picture of a woman surrounded by a wreath. Have fun!

  3. Physical interactions: If you would like to try out Habitat with dynamic objects using the interactive viewer, first set up the test object assets by running the data download utility:

    python -m habitat_sim.utils.datasets_download --uids habitat_example_objects --data-path /path/to/data/

    To run an interactive C++ example GUI application with physics enabled, run:

    # ./build/viewer if compiling locally
    habitat-viewer --enable-physics --object-dir data/objects/example_objects -- data/scene_datasets/habitat-test-scenes/apartment_1.glb

    The viewer application will output user interface help to the console at runtime.

  4. Non-interactive testing: Run the example script:

    python examples/example.py --scene /path/to/data/scene_datasets/habitat-test-scenes/skokloster-castle.glb

    The agent will traverse a particular path and you should see the performance stats at the very end, something like this: 640 x 480, total time: 3.208 sec. FPS: 311.7. Note that the test scenes do not provide semantic meshes. If you would like to test the semantic sensors via example.py, please use the data from the Matterport3D dataset (see Datasets). We have also provided an example demo for reference.

    To run a physics example in python (after building with "Physics simulation via Bullet"):

    python examples/example.py --scene /path/to/data/scene_datasets/habitat-test-scenes/skokloster-castle.glb --enable_physics

    Note that in this mode the agent will be frozen and oriented toward the spawned physical objects. Additionally, --save_png can be used to output agent visual observation frames of the physical scene to the current directory.
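When scripting benchmark sweeps, the summary line printed in step 4 can also be parsed programmatically. A minimal sketch (the stats string below is the sample output quoted above; the exact format may differ across versions):

```python
import re

# Parse the example.py summary line and recover the implied frame count.
# The string below is the sample output quoted above; format may vary.
stats = "640 x 480, total time: 3.208 sec. FPS: 311.7"
m = re.search(r"total time: ([\d.]+) sec\. FPS: ([\d.]+)", stats)
total_time, fps = float(m.group(1)), float(m.group(2))
frames = fps * total_time
print(f"~{frames:.0f} frames in {total_time} sec")  # ~1000 frames
```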

Common testing issues

  • If you are running on a remote machine and experience display errors when initializing the simulator, e.g.

     X11: The DISPLAY environment variable is missing
     Could not initialize GLFW

    ensure you do not have DISPLAY defined in your environment (run unset DISPLAY to undefine the variable)

  • If you see libGL errors, chances are your libGL is located at a non-standard location. See e.g. this issue.

Documentation

Browse the online Habitat-Sim documentation.

To get you started, see the Lighting Setup tutorial for adding new objects to existing scenes and relighting the scene & objects. The Image Extractor tutorial shows how to get images from scenes loaded in Habitat-Sim.

Questions?

Join the Slack channel.

Datasets

  • The full Matterport3D (MP3D) dataset for use with Habitat can be downloaded using the official Matterport3D download script as follows: python download_mp.py --task habitat -o path/to/download/. You only need the habitat zip archive and not the entire Matterport3D dataset. Note that this download script requires python 2.7 to run.
  • The Gibson dataset for use with Habitat can be downloaded by agreeing to the terms of use in the Gibson repository.
  • Semantic information for Gibson is available from the 3DSceneGraph dataset. The semantic data will need to be converted before it can be used within Habitat:
    tools/gen_gibson_semantics.sh /path/to/3DSceneGraph_medium/automated_graph /path/to/GibsonDataset /path/to/output
    To use semantics, you will need to enable the semantic sensor.
  • To work with the Replica dataset, you need a file called sorted_faces.bin for each model. These files (one per model), along with a convenient setup script, can be downloaded here: sorted_faces.zip. To set them up:
  - Download the file from the above link;
  - Unzip it;
  - Use the script within to copy each data file to its corresponding folder (you will need to provide the path to the folder containing all Replica models, e.g. ~/models/replica/).

Examples

Load a specific MP3D or Gibson house: examples/example.py --scene path/to/mp3d/house_id.glb.

Additional arguments to example.py are provided to change the sensor configuration, print statistics of the semantic annotations in a scene, compute action-space shortest path trajectories, and set other useful functionality. Refer to the example.py and demo_runner.py source files for an overview.

To reproduce the benchmark table from above run examples/benchmark.py --scene /path/to/mp3d_example/17DRP5sb8fy/17DRP5sb8fy.glb.

External Contributions

  • If you use the noise model from PyRobot, please cite their technical report.

    Specifically, the noise model used for the noisy control functions named pyrobot_* and defined in habitat_sim/agent/controls/pyrobot_noisy_controls.py

  • If you use the Redwood Depth Noise Model, please cite their paper.

    Specifically, the noise model defined in habitat_sim/sensors/noise_models/redwood_depth_noise_model.py and src/esp/sensor/RedwoodNoiseModel.*

License

Habitat-Sim is MIT licensed. See the LICENSE for details.