
SiTAR (Situated Trajectory Analysis for AR)

This repository contains resources and research artifacts for the paper "SiTAR: Situated Trajectory Analysis for In-the-Wild Pose Error Estimation" that appeared in Proceedings of IEEE ISMAR 2023. It includes the code required to implement SiTAR, as well as samples of the new open-source VI-SLAM datasets we created to evaluate our pose error estimation method.

To create our new VI-SLAM datasets we used our previously published game engine-based emulator, Virtual-Inertial SLAM. For more information on this tool, implementation code and instructions, and examples of the types of projects it can support, please visit the Virtual-Inertial SLAM GitHub repository.

SiTAR Overview

Our SiTAR system provides situated visualizations of device pose error estimates on real AR devices (implemented here for ARCore). Our code facilitates three types of pose error visualizations, illustrated in the image below: 1) trajectory-only (left), 2) trajectory + exclamation points (middle), and 3) trajectory + warning signs (right).

SiTAR teaser image

The system architecture for SiTAR is shown below. The system frontend, which generates situated trajectory visualizations, runs on the user AR device; the system backend, which generates pose error estimates, runs on a server and one or more playback AR devices. The backend can be implemented using either an edge or a cloud server.

SiTAR system architecture
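Communication between the frontend and backend is over HTTP (the backend server is a FastAPI app). The actual routes and payloads are defined in DrawTrajectory.cs and SiTAR-Server.py; the snippet below is only a minimal sketch, assuming a hypothetical /trajectory endpoint and JSON payload, of how the user AR device could hand a completed trajectory to the server from Unity.

    // Minimal sketch only: the endpoint path, payload shape, and class name are
    // assumptions for illustration, not the actual SiTAR protocol.
    using System.Collections;
    using System.Text;
    using UnityEngine;
    using UnityEngine.Networking;

    public class TrajectoryUploadSketch : MonoBehaviour
    {
        [SerializeField] private string serverAddress = "192.168.0.10:8000"; // your server IP and port

        public IEnumerator UploadTrajectory(string trajectoryJson)
        {
            // Hypothetical route; the real one is defined in SiTAR-Server.py.
            string url = $"http://{serverAddress}/trajectory";
            using (var request = new UnityWebRequest(url, UnityWebRequest.kHttpVerbPOST))
            {
                request.uploadHandler = new UploadHandlerRaw(Encoding.UTF8.GetBytes(trajectoryJson));
                request.downloadHandler = new DownloadHandlerBuffer();
                request.SetRequestHeader("Content-Type", "application/json");

                yield return request.SendWebRequest();

                if (request.result != UnityWebRequest.Result.Success)
                    Debug.LogError($"Trajectory upload failed: {request.error}");
                else
                    Debug.Log($"Server response: {request.downloadHandler.text}");
            }
        }
    }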

Below is a short demo video of our SiTAR system in action, using an edge-based architecture: a Google Pixel 7 Pro as the user AR device, an Apple MacBook Pro as the server, and a Google Pixel 7 as the playback AR device. The video shows the following steps:

  1. Creation of a trajectory on the user AR device ('Trajectory creation').
  2. Replaying of the visual and inertial input data for that trajectory on the playback AR device to obtain multiple trajectory estimates ('Sequence playback').
  3. Situated visualization of the trajectory on the user AR device before pose error estimates are added ('Trajectory visualization without error estimates').
  4. Our uncertainty-based pose error estimation running on the server ('Uncertainty-based error estimation').
  5. Situated visualization of the trajectory on the user AR device once pose error estimates are added, with high pose error associated with the blank wall highlighted using our 'trajectory + exclamation points' visualization ('Trajectory visualization with error estimates').

SiTAR demo video

Implementation Resources

Our implementation code and associated resources for SiTAR are provided in three parts, for the user AR device, the server and the playback AR device respectively. The code for each can be found in the repository folders named 'user-AR-device', 'server', and 'playback-AR-device'. The implementation resources consist of the following:

User AR device: a C# script, DrawTrajectory.cs, which implements the 'Trajectory creation' and 'Trajectory visualization' modules in SiTAR; Unity prefabs for base trajectory visualization (Start.prefab, Stop.prefab, Cylinder.prefab, Joint.prefab and Frustum.prefab); and Unity prefabs and materials for pose error visualizations (ErrorAreaHigh.prefab, ErrorAreaMedium.prefab, ErrorPatchHigh.prefab, ErrorPatchMedium.prefab, ErrorHigh.mat and ErrorMedium.mat).
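These assets are wired into DrawTrajectory.cs through Unity inspector slots, as described in the instructions further down. As a rough orientation only, the script's inspector interface corresponds to serialized fields along the following lines; the field names and types here are inferred from the slot names in this README, not copied from DrawTrajectory.cs.

    // Sketch of the inspector-facing fields implied by the slot names in this README;
    // the actual declarations in DrawTrajectory.cs may differ.
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    public class DrawTrajectorySketch : MonoBehaviour
    {
        [SerializeField] private ARCameraManager cameraManager; // 'Camera Manager' slot
        [SerializeField] private Camera arCamera;               // 'Camera' slot

        // Base trajectory visualization prefabs
        [SerializeField] private GameObject startPrefab;
        [SerializeField] private GameObject stopPrefab;
        [SerializeField] private GameObject cylinderPrefab;
        [SerializeField] private GameObject jointPrefab;
        [SerializeField] private GameObject frustumPrefab;

        // Pose error visualization prefabs and materials
        [SerializeField] private GameObject errorAreaHighPrefab;
        [SerializeField] private GameObject errorAreaMediumPrefab;
        [SerializeField] private GameObject errorPatchHighPrefab;
        [SerializeField] private GameObject errorPatchMediumPrefab;
        [SerializeField] private Material errorHigh;
        [SerializeField] private Material errorMedium;
    }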

Server: a Python script SiTAR-Server.py, which implements the 'Sequence assignment' and 'Uncertainty-based error estimation' modules in SiTAR.

Playback AR device: a C# script TrajectoryPlayback.cs, which implements the 'Sequence playback' module in SiTAR.

Implementation Instructions

Prerequisites:

  • 2 or more Android devices running ARCore v1.3 or above.
  • A server with Python 3.8 or above, with the evo (https://github.com/MichaelGrupp/evo) and FastAPI (https://fastapi.tiangolo.com/lo/) Python packages and Android SDK Platform Tools (https://developer.android.com/tools/releases/platform-tools) installed.
  • For building the necessary apps to the AR devices: Unity 2021.3 or later, with the AR Foundation framework v4.2 or later and ARCore Extensions v1.36 or later packages installed.

Tested with Google Pixel 7 and Google Pixel 7 Pro devices running ARCore v1.31, and an Apple MacBook Pro as the edge server (Python 3.8).

User AR device:

  1. Create a Unity project with the AR Foundation template. Make sure ARCore Extensions is fully set up by following the instructions here: https://developers.google.com/ar/develop/unity-arf/getting-started-extensions.
  2. Add the DrawTrajectory.cs script (in the user-AR-device folder) to the AR Session Origin GameObject.
  3. Drag the AR Camera GameObject to the 'Camera Manager' and 'Camera' slots in the Draw Trajectory inspector panel.
  4. Add the Start.prefab, Stop.prefab, Cylinder.prefab, Joint.prefab and Frustum.prefab files (in the user-AR-device folder) to your Assets folder, and drag them to the 'Start Prefab', 'Stop Prefab', 'Cylinder Prefab', 'Joint Prefab' and 'Frustum Prefab' slots in the Draw Trajectory inspector panel.
  5. (Optional) If using the exclamation points or warning signs visualizations, add the ErrorAreaHigh.prefab, ErrorAreaMedium.prefab, ErrorPatchHigh.prefab, and ErrorPatchMedium.prefab files (in the user-AR-device folder) to your Assets folder, and drag them to the 'Error Area High Prefab', 'Error Area Medium Prefab', 'Error Patch High Prefab', and 'Error Patch Medium Prefab' slots in the Draw Trajectory inspector panel.
  6. Add the ErrorHigh.mat and ErrorMedium.mat files (in the user-AR-device folder) to your Assets folder, and drag them to the 'Error High' and 'Error Medium' slots in the Draw Trajectory inspector panel.
  7. Add Start and Stop UI buttons, drag them to the 'Start Button' and 'Stop Button' slots in the Draw Trajectory inspector panel, and set their OnClick actions to 'DrawTrajectory.HandleStartClick' and 'DrawTrajectory.HandleStopClick' respectively.
  8. Either hardcode your server IP address into line 481 of DrawTrajectory.cs, or add a UI panel with a text field to capture this data from the user (a minimal sketch of steps 7 and 8 follows this list).
  9. (Optional) Add UI text objects to display SiTAR status, trajectory duration, length, average environment depth, and drag them to the 'Status', 'Trajectory Duration', 'Trajectory Length' and 'Trajectory Depth' slots in the Draw Trajectory inspector panel.
  10. (Optional) Add audio clips for notifying when error estimates are ready, user captures image, and user has captured all regions, and drag them to the 'Audio Results', 'Audio Capture' and 'Audio Complete' slots in the Draw Trajectory inspector panel.
  11. Set the Build platform to Android, select your device under Run device, and click Build and Run.
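As referenced in step 8, the button wiring and server address can also be handled in code rather than through the inspector. The following is a minimal sketch of that approach: HandleStartClick and HandleStopClick are the DrawTrajectory methods named in step 7 (assumed here to be public and parameterless), while everything else is hypothetical glue code.

    // Sketch of an in-code alternative to steps 7-8. Only HandleStartClick and
    // HandleStopClick come from this README; the rest is illustrative.
    using UnityEngine;
    using UnityEngine.UI;

    public class SiTARUISetupSketch : MonoBehaviour
    {
        [SerializeField] private DrawTrajectory drawTrajectory;
        [SerializeField] private Button startButton;
        [SerializeField] private Button stopButton;
        [SerializeField] private InputField serverIpField; // alternative to hardcoding the IP

        private void Awake()
        {
            // Step 7: hook the Start/Stop buttons to the DrawTrajectory handlers
            // (wrap the calls in lambdas if the handlers take arguments).
            startButton.onClick.AddListener(drawTrajectory.HandleStartClick);
            stopButton.onClick.AddListener(drawTrajectory.HandleStopClick);

            // Step 8: capture the server IP from a UI text field instead of editing
            // line 481 of DrawTrajectory.cs; pass it on via whatever member
            // DrawTrajectory actually exposes for the server address.
            serverIpField.onEndEdit.AddListener(ip => Debug.Log($"Server IP entered: {ip}"));
        }
    }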

Server:

  1. Create a folder on the server where SiTAR files will be located. Add an additional sub-folder named 'trajectories'.
  2. Download the server folder in the repository to your SiTAR folder.
  3. Open the SiTAR-Server.py file in the server folder, complete the required configuration parameters on lines 20-29, and save.
  4. In Terminal or Command Prompt, navigate to your SiTAR folder.
  5. Start the server using the following command: uvicorn server.SiTAR-Server:app --host 0.0.0.0

Playback AR device:

  1. Create a Unity project with the AR Foundation template. Make sure ARCore Extensions is fully set up by following the instructions here: https://developers.google.com/ar/develop/unity-arf/getting-started-extensions.
  2. (Optional) Add the AR Plane Manager and AR Point Cloud Manager scripts (included in AR Foundation) to the AR Session Origin GameObject if you wish to visualize planes and feature points during playback.
  3. Add the TrajectoryPlayback.cs script (in the playback-AR-device folder) to the AR Session GameObject.
  4. Create a UI text object to display log messages, and drag it to the 'Log' slot in the Trajectory Playback inspector panel.
  5. Drag the AR Camera GameObject to the 'Camera Manager' and 'Camera' slots in the Trajectory Playback inspector panel.
  6. Set the Build platform to Android, select your device under Run device, and click Build and Run.

Datasets

The Hall and LivingRoom VI-SLAM datasets we created to evaluate our uncertainty-based pose error estimation method can be downloaded here: https://drive.google.com/drive/folders/1VwAgcCly0RDUmyME4MHDrcBfkXRbitpC?usp=sharing .

Each dataset is contained in a separate folder (e.g., Hall.zip), which contains sub-folders for each sequence, along with the required ORB-SLAM3 configuration file, config.yaml (camera intrinsics and extrinsics, IMU noise parameters, ORB extractor parameters and visualization settings). Each sequence folder contains the following (formatted to streamline execution in ORB-SLAM3; an illustrative layout follows the list):

  1. groundtruth folder, containing the formatted ground truth poses for the sequence (data.csv), plus sensor characteristics from the original SenseTime dataset (sensor.yaml).
  2. mav0 folder, containing a cam0/data folder with the camera images, and an imu0 folder with the formatted IMU data (data.csv) plus sensor characteristics from the original SenseTime dataset (sensor.yaml).
  3. sequence_name.txt file (e.g., A1.txt), containing the list of camera image timestamps (in the format required by ORB-SLAM3).
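Putting the above together, a downloaded dataset unpacks to roughly the following layout (shown for a sequence named A1, matching the example above; the other sequence folders follow the same pattern):

    Hall/
    ├── config.yaml              ORB-SLAM3 configuration (shared by all sequences)
    ├── A1/
    │   ├── A1.txt               camera image timestamps
    │   ├── groundtruth/
    │   │   ├── data.csv         formatted ground truth poses
    │   │   └── sensor.yaml      sensor characteristics (original SenseTime dataset)
    │   └── mav0/
    │       ├── cam0/
    │       │   └── data/        camera images
    │       └── imu0/
    │           ├── data.csv     formatted IMU data
    │           └── sensor.yaml  sensor characteristics (original SenseTime dataset)
    └── ...                      further sequence folders with the same structure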

Citation

If you use SiTAR in an academic work, please cite:

@inproceedings{SiTAR,
  title={SiTAR: Situated trajectory analysis for in-the-wild pose error estimation},
  author={Scargill, Tim and Chen, Ying and Hu, Tianyi and Gorlatova, Maria},
  booktitle={Proceedings of IEEE ISMAR 2023},
  year={2023}
}

Acknowledgements

The authors of this repository are Tim Scargill and Maria Gorlatova. Contact information of the authors:

  • Tim Scargill (timothyjames.scargill AT duke.edu)
  • Maria Gorlatova (maria.gorlatova AT duke.edu)

This work was supported in part by NSF grants CSR-1903136, CNS-1908051, and CNS-2112562, NSF CAREER Award IIS-2046072, a Meta Research Award, and a Cisco Research Award.