NeRF Capture

Collecting NeRF datasets is difficult. NeRF Capture is an iOS application that allows any iPhone or iPad to quickly collect or stream posed images to InstantNGP. If your device has a LiDAR sensor, depth images are saved/streamed as well. The app has two modes: Offline and Online. In Offline mode, the dataset is saved to the device and can be accessed in the Files app in the NeRFCapture folder. Online mode uses CycloneDDS to publish the posed images on the network. A Python script then collects the images and provides them to InstantNGP.

Download on the App Store

Online Mode

Use the Reset button to reset the coordinate system to the current position of the camera. This takes a moment; wait until tracking has initialized before moving the device away.

Switch the app to Online mode. On the computer running InstantNGP, make sure that CycloneDDS is installed in the same Python environment that runs pyngp. OpenCV and Pillow are also needed to save and resize images.

pip install cyclonedds
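
If OpenCV and Pillow are not already present in that environment, they can be installed as well. The package names below are the usual PyPI names and are an assumption, not something specified by this repository:

pip install opencv-python pillow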

Check that the computer can see the device on your network by running in your terminal:

cyclonedds ps

Further instructions for the capture script can be found in the InstantNGP repository.
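
For a sense of what the receiving side looks like, here is a minimal CycloneDDS subscriber sketch in Python. The topic name and the frame layout below are simplified assumptions made for illustration; the authoritative definitions live in InstantNGP's nerfcapture2nerf.py script, which should be used for actual capture.

# Minimal CycloneDDS subscriber sketch. CaptureFrame is a hypothetical,
# simplified stand-in for the frame struct published by the app; see
# nerfcapture2nerf.py in InstantNGP for the real topic name and fields.
from dataclasses import dataclass
from cyclonedds.domain import DomainParticipant
from cyclonedds.topic import Topic
from cyclonedds.sub import DataReader
from cyclonedds.util import duration
from cyclonedds.idl import IdlStruct
from cyclonedds.idl.types import uint32, float32, sequence, uint8

@dataclass
class CaptureFrame(IdlStruct, typename="CaptureFrame"):  # assumed layout
    id: uint32
    fl_x: float32                         # focal length x (pixels)
    fl_y: float32                         # focal length y (pixels)
    cx: float32                           # principal point x
    cy: float32                           # principal point y
    transform_matrix: sequence[float32]   # flattened 4x4 camera-to-world pose
    width: uint32
    height: uint32
    image: sequence[uint8]                # raw image bytes

participant = DomainParticipant()
topic = Topic(participant, "Frames", CaptureFrame)  # topic name assumed
reader = DataReader(participant, topic)

# Block until the app publishes frames, then report what arrived.
for frame in reader.take_iter(timeout=duration(seconds=30)):
    print(f"frame {frame.id}: {frame.width}x{frame.height}")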

Offline Mode

In Offline mode, clicking Start initializes the dataset. Take a few images, then click End when you're done. The dataset is saved as a zip file in your Files app, in the format that InstantNGP expects. Unzip the dataset and drag and drop it into InstantNGP. We have found it fairly difficult to transfer files from an iOS device to another computer, so we recommend running the app in Online mode and collecting the dataset with the nerfcapture2nerf.py script found in InstantNGP.
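
As a quick sanity check after unzipping, the dataset can be inspected from Python. This is a sketch assuming the usual InstantNGP layout (a transforms.json at the dataset root listing per-frame file paths and 4x4 camera-to-world matrices); the dataset path used here is hypothetical:

# Inspect an unzipped NeRFCapture dataset (path below is only an example).
import json
from pathlib import Path

dataset = Path("~/Downloads/nerfcapture_dataset").expanduser()  # assumed location
with open(dataset / "transforms.json") as f:
    transforms = json.load(f)

# Camera intrinsics are typically stored at the top level in this format.
print("intrinsics:", transforms.get("fl_x"), transforms.get("fl_y"))
for frame in transforms["frames"]:
    # Each frame references an image file and a 4x4 pose matrix.
    print(frame["file_path"], "pose rows:", len(frame["transform_matrix"]))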

Citation

If you use this software in your research, please consider citing it.

@misc{NeRFCapture,
  author={Abou-Chakra, Jad},
  url={https://github.com/jc211/NeRFCapture},
  journal={NeRFCapture},
  year={2023},
  month={Mar}
}