This package contains the core signal processing and pose estimation software components of the UVDAR system necessary for running it on a MAV.
- the UVDAR system is a visual mutual relative localization system for cooperating micro-scale UAVs
- based on ultraviolet-sensitive cameras and blinking ultraviolet markers
- can be used both indoors and outdoors without infrastructure
- robust to a variety of lighting conditions
For the system to work, the following hardware is required:
- One or more calibrated (using the OCamCalib model) grayscale camera sensors with ultraviolet bandpass filters. In our setup these are:
  - mvBlueFOX MLC200wG cameras
  - Sunnex DSL215 lenses with ~180 degrees of horizontal FOV
  - Midopt BP365-R6 filters with our custom holder between the sensor and the lens
- Blinking ultraviolet LEDs (395 nm) attached to extreme points of the target UAVs. In our setup, these are:
  - ProLight Opto PM2B-1LLE
  - Attached to the ends of the arms of the UAVs, or alternatively on top of the UAVs in the case of "beacons"
  - For quadrotors, the markers comprise two LEDs each, rotated 90° from each other in the "yaw" axis of the UAV
  - The blinking control is implemented as our open-hardware board and firmware
Software dependencies (a workspace sketch for obtaining the ROS packages follows the list):
- ROS (Robot Operating System) Melodic Morenia
- mrs_lib - ROS package with utility libraries used by the MRS group
- mrs_msgs - ROS package with message types used by the MRS group
- bluefox2 - ROS package providing an interface to mvBlueFOX cameras
- mrs_uav_system - our ROS-based ecosystem for flying and testing multi-UAV systems
- uvdar_gazebo_plugin - emulation library that produces the meta-data used to generate a synthetic UV LED image stream in simulation
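A rough sketch of gathering these ROS package dependencies into a catkin workspace is shown below. The GitHub organization and repository URLs are assumptions based on the package names, not taken from this document; substitute whatever source you actually obtain the packages from.

```bash
# Sketch: gathering the ROS package dependencies into a catkin workspace.
# Repository URLs are assumptions -- adjust them to your actual sources.
mkdir -p ~/catkin_ws/src && cd ~/catkin_ws/src
git clone https://github.com/ctu-mrs/mrs_lib.git
git clone https://github.com/ctu-mrs/mrs_msgs.git
git clone https://github.com/ctu-mrs/bluefox2.git              # only needed with real mvBlueFOX cameras
git clone https://github.com/ctu-mrs/uvdar_gazebo_plugin.git   # only needed for testing in simulation
```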
To install the package:
- Install the dependencies listed above.
- Clone this repository into a ROS workspace as a package. If you are using the mrs_modules meta package (currently only accessible internally to MRS staff, to be released at a later date), this repository is already included.
- Build the package using catkin tools (e.g. `catkin build uvdar_core`); see the command-line sketch below.
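A minimal command-line sketch of these steps, assuming the workspace layout from above and that this repository is hosted as ctu-mrs/uvdar_core (the clone URL is an assumption):

```bash
# Clone this package into the workspace and build it with catkin tools.
cd ~/catkin_ws/src
git clone https://github.com/ctu-mrs/uvdar_core.git   # URL assumed -- use your actual remote
cd ~/catkin_ws
catkin build uvdar_core
source devel/setup.bash   # make the freshly built package visible to ROS
```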
In order to test the system in simulation, install all software dependencies, including those designated for testing in simulation (above), and run one of the scripts in the `scripts` folder:
- For testing separation of units based on position and beacons, use beacon_test.sh
- For testing separation of units based on different blinking frequencies, use multi_frequency_test.sh
- For testing separation of units based on different blinking signal sequences, use new_signaling.sh
Note that the script slows down the simulation below real time. This is necessary because if Gazebo slows itself down due to insufficient processing power, the blinking signals get corrupted. Test the maximum admissible real-time factor for your computer by checking how far you can increase it (or how far you have to decrease it) such that the real-time factor consistently stays at the value it was set to.
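A usage sketch follows, assuming the package was cloned into ~/catkin_ws/src as above; `gz stats` is the standard Gazebo (classic) tool for watching the real-time factor.

```bash
# Launch one of the simulation tests and verify that the real-time factor holds.
cd ~/catkin_ws/src/uvdar_core/scripts
./multi_frequency_test.sh

# In a separate terminal, once Gazebo is running:
gz stats   # the reported real-time factor should stay at the value the script sets;
           # if it keeps dropping below that value, lower the configured factor and retry
```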
The package comprises multiple ROS nodes (N) and nodelets (n):
- UVDARDetector - n - Detects bright points in the UV camera image. These are used as candidates for UV LED markers
- UVDARBlinkProcessor - n - Extracts blinking signals and image positions of the markers detected previously
- UVDARPoseCalculator - N - Calculates approximate poses and error covariances of the MAVs carrying the UV LED markers. Note that this is an example for the specific layouts we use on quadrotors and hexarotors; if you need a different layout, you will also need to write your own pose calculation
- UVDARKalman - N - Filters sets of detected poses with covariances, based on their positions or the included identities. This filtering occurs in a selected coordinate frame
- UVDARBluefoxEmulator - n - Generates an image stream similar to the output of our Bluefox cameras with UV bandpass filters (above). This image is currently rudimentary, with a background of a constant shade of grey and white circles where the markers appear. The function of this node depends on our uvdar_gazebo_plugin, with which it needs to communicate
- MaskGenerator - N - Generates masks for specific cameras on specific MAVs. This is necessary to suppress detections of markers on the given observer, as well as reflections from e.g. its metallic surfaces in front of the camera. The masks can also be generated manually.
- LedManager - N - Sends commands to our controller boards that set the signals of the blinking UV LEDs on the current MAV, using the Baca Protocol. You can command this node using ROS services (see the introspection sketch after this list). This node is loaded using led_manager.launch.
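The introspection sketch below shows how the running components and their ROS interfaces can be inspected. The actual topic and service names are launch-file dependent and are not specified in this document, so only generic discovery commands are used.

```bash
# Start the LED manager (launch file name as stated above; namespaces/arguments may differ in your setup).
roslaunch uvdar_core led_manager.launch

# In another terminal, discover what the UVDAR nodes and nodelets expose.
rosnode list                     # the UVDAR nodes/nodelets described above should appear here
rostopic list | grep -i uvdar    # detection, blinking-signal and pose topics (names depend on your launch files)
rosservice list | grep -i led    # services offered by the LedManager; command them with `rosservice call`
```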
This work would not be possible without the hard work, resources and support of the Multi-Robot Systems (MRS) group at Czech Technical University.
This package contains the following third-party libraries by the respective authors: