This repository will hold the code and documentation for the year-one situational awareness demonstrator. The demonstrator consists of four data flows:
- A user will be wearing UWB localization tags. These tags will send ranges over WiFi to an MQTT broker.
- The location and orientation of the user's VR headset will be published to MQTT.
- A drone will be flying around the space being localized by OptiTrack. The drone's location will also be published to the MQTT broker.
- On the drone will be a BLEES environmental sensor. It will send lux readings to the MQTT broker.
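As a concrete sketch, a subscriber on the broker side could normalize the four flows into a common record shape. The topic names and payload fields below are assumptions for illustration, not the demonstrator's actual schema:

```python
import json

# Hypothetical topic names and JSON payload schemas for the four data flows.
# The real demonstrator may use different topics and fields.
def normalize(topic: str, payload: bytes) -> dict:
    """Map a raw MQTT message onto a common record shape."""
    data = json.loads(payload)
    if topic.startswith("uwb/"):          # UWB tag range measurement
        return {"source": "uwb", "tag": topic.split("/")[1],
                "range_m": float(data["range"])}
    if topic == "headset/pose":           # VR headset position + orientation
        return {"source": "headset",
                "position": data["position"],        # [x, y, z]
                "orientation": data["orientation"]}  # quaternion
    if topic == "drone/pose":             # OptiTrack-derived drone location
        return {"source": "drone", "position": data["position"]}
    if topic == "drone/lux":              # BLEES light sensor reading
        return {"source": "blees", "lux": float(data["lux"])}
    raise ValueError(f"unexpected topic: {topic}")

record = normalize("drone/lux", json.dumps({"lux": 312.5}).encode())
```

A uniform record shape like this is what lets the headset visualizer consume all four flows through one code path.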
Using these four data flows:
- The drone will fly around collecting lux data (in a sweep or random walk) while avoiding the user.
- The user will walk around the space wearing the VR headset. The headset will visualize both the drone and the lux data collected by its light sensor.
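The "avoiding the user" behavior can be sketched as a keep-out check on candidate sweep waypoints. The safety radius here is an illustrative value, not a project parameter:

```python
import math

SAFETY_RADIUS_M = 1.5  # assumed keep-out distance around the user

def too_close(drone_pos, user_pos, radius=SAFETY_RADIUS_M):
    """True if the drone is inside the keep-out sphere around the user."""
    return math.dist(drone_pos, user_pos) < radius

def accept_waypoint(candidate, user_pos, radius=SAFETY_RADIUS_M):
    """Reject sweep/random-walk waypoints that enter the keep-out zone."""
    return candidate if not too_close(candidate, user_pos, radius) else None

# A waypoint 3 m from the user is accepted; one 1 m away is rejected.
ok = accept_waypoint((3.0, 0.0, 1.0), (0.0, 0.0, 0.0))
rejected = accept_waypoint((0.0, 0.0, 1.0), (0.0, 0.0, 0.0))
```

In the actual demonstrator this check would run against the user's UWB/headset position as it streams in over MQTT.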
This demo exhibits several key components of CONIX:
- Drone control and obstacle avoidance
- The coordination of multiple localization systems into a single coordinate space
- Visualization of data in VR, simulating the kind of data that might be needed in a situation calling for enhanced perception/awareness
- The standardization of data formats across multiple systems and platforms from different research groups in CONIX.
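The coordination of localization systems into a single coordinate space amounts to applying a rigid transform (rotation plus translation) from each system's local frame, e.g. OptiTrack's, into a shared world frame. A minimal sketch, with illustrative calibration values:

```python
import math

def to_world(point, yaw_rad, translation):
    """Rotate a local-frame point about z by yaw, then translate,
    mapping it into the shared world frame."""
    x, y, z = point
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    tx, ty, tz = translation
    return (c * x - s * y + tx, s * x + c * y + ty, z + tz)

# e.g. a local frame rotated 90 deg from the world frame, offset by (1, 2, 0)
world = to_world((1.0, 0.0, 0.5), math.pi / 2, (1.0, 2.0, 0.0))
```

The yaw angle and offset would come from a one-time calibration between each localization system and the shared frame; a full implementation would use a 3D rotation (e.g. a quaternion) rather than yaw only.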