Fast-Planner

A Robust and Efficient Trajectory Planner for Quadrotors

News

This package is under active maintenance. New features will be listed here.

  • The heading (yaw angle) planner, which enables smoother changes of heading direction, is available.

  • The online mapping algorithm is now available. It takes depth image and camera pose pairs as input, performs raycasting to update a probabilistic volumetric map, and builds a Euclidean signed distance field (ESDF) for the planning system (a minimal sketch of this input step follows this list).
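
As a rough illustration of this input step, the sketch below back-projects a single depth pixel into the world frame with a pinhole camera model. This is generic code, not the package's actual API: the function name, intrinsics, and pose convention (rotation R_wc and translation t_wc of the camera in the world frame) are assumptions made for the example.

  // Illustrative sketch only (not the plan_env API): back-project one depth
  // pixel (u, v) with metric depth z into the world frame, using a pinhole
  // camera model and the camera pose (R_wc, t_wc) obtained from odometry.
  #include <Eigen/Dense>

  Eigen::Vector3d depthPixelToWorld(int u, int v, double z,
                                    double fx, double fy, double cx, double cy,
                                    const Eigen::Matrix3d& R_wc,
                                    const Eigen::Vector3d& t_wc) {
    // Point in the camera frame (x right, y down, z forward).
    Eigen::Vector3d p_cam((u - cx) * z / fx, (v - cy) * z / fy, z);
    // Transform into the world frame; this is the endpoint of the ray that the
    // mapping module casts from the camera origin t_wc.
    return R_wc * p_cam + t_wc;
  }

Voxels along the segment from the camera origin t_wc to the returned point would then be updated as free, the endpoint voxel as occupied, and the ESDF recomputed from the resulting occupancy map.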

Overview

Fast-Planner is a robust and efficient planning system that enables agile and fast autonomous flight for quadrotors. It takes in odometry and sensor streams (such as depth images and point clouds) as input, and outputs high-quality trajectories within a few milliseconds. It supports aggressive and fully autonomous flight even in unknown and cluttered environments. Demonstrations of the planner have been reported by IEEE Spectrum.

Authors: Boyu Zhou, Fei Gao and Shaojie Shen from the HKUST Aerial Robotics Group.

Video: (demonstration video)

This package contains the implementation of Fast-Planner (in the fast_planner folder) and a lightweight quadrotor simulator (in the uav_simulator folder). Key components are:

  • plan_env: The online mapping algorithm. It takes depth image (or point cloud) and camera pose (odometry) pairs as input, performs raycasting to update a probabilistic volumetric map, and builds a Euclidean signed distance field (ESDF) for the planning system.
  • path_searching: Front-end path searching algorithms. Currently it includes a kinodynamic version of the A* algorithm that respects the dynamics of quadrotors. The standard A* is also available.
  • bspline_opt: Gradient-based trajectory optimization based on a B-spline trajectory representation (see the sketch after this list).
  • plan_manage: High-level modules that schedule and call the mapping and planning algorithms. Interfaces for launching the whole system, as well as the configuration files, are contained here.
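
To make the B-spline representation used by bspline_opt more concrete, here is a minimal, self-contained sketch (generic textbook code, not taken from this package) that evaluates one segment of a uniform cubic B-spline from four consecutive control points. In the planner, it is such control points that the gradient-based optimizer adjusts.

  // Illustrative sketch only (not the bspline_opt API): evaluate one segment of
  // a uniform cubic B-spline from four consecutive control points p0..p3 at a
  // local parameter u in [0, 1).
  #include <Eigen/Dense>

  Eigen::Vector3d evalUniformCubicBspline(const Eigen::Vector3d& p0,
                                          const Eigen::Vector3d& p1,
                                          const Eigen::Vector3d& p2,
                                          const Eigen::Vector3d& p3,
                                          double u) {
    // Standard uniform cubic B-spline basis functions.
    const double u2 = u * u, u3 = u2 * u;
    const double b0 = (1 - u) * (1 - u) * (1 - u) / 6.0;
    const double b1 = (3 * u3 - 6 * u2 + 4) / 6.0;
    const double b2 = (-3 * u3 + 3 * u2 + 3 * u + 1) / 6.0;
    const double b3 = u3 / 6.0;
    return b0 * p0 + b1 * p1 + b2 * p2 + b3 * p3;
  }

A convenient property of this representation is the convex hull property: each segment stays inside the convex hull of its four control points, so reasoning about the control points bounds the whole trajectory.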

If you use Fast-Planner for your application or research, please cite our related paper:

@article{zhou2019robust,
  title={Robust and efficient quadrotor trajectory generation for fast autonomous flight},
  author={Zhou, Boyu and Gao, Fei and Wang, Luqi and Liu, Chuhao and Shen, Shaojie},
  journal={IEEE Robotics and Automation Letters},
  volume={4},
  number={4},
  pages={3529--3536},
  year={2019},
  publisher={IEEE}
}

1. Prerequisites

  • Our software is developed and tested on Ubuntu 16.04 with ROS Kinetic. Other versions may require minor modifications.

  • We use NLopt to solve the non-linear optimization problem.

  • The uav_simulator depends on the C++ linear algebra library Armadillo, which can be installed by sudo apt-get install libarmadillo-dev.

  • Optional: if you want to run the more realistic depth camera in uav_simulator, you need to install the CUDA Toolkit. Otherwise, a less realistic depth sensor model will be used (see the section Use GPU Depth Rendering below).

2. Build on ROS

After the prerequisites are satisfied, you can clone this repository into your catkin workspace and build it with catkin_make. A new workspace is recommended:

  cd ${YOUR_WORKSPACE_PATH}/src
  git clone https://github.com/HKUST-Aerial-Robotics/Fast-Planner.git
  cd ../
  catkin_make

Use GPU Depth Rendering (Optional)

The local_sensing package in uav_simulator has the option of using the GPU or the CPU to render the depth sensor measurements. By default, the CPU version is selected in CMakeLists.txt:

set(ENABLE_CUDA false)
# set(ENABLE_CUDA true)

The GPU version is recommended, because it generates depth images that are closer to those of a real depth camera. If you want to use GPU depth rendering, set ENABLE_CUDA to true, and also remember to change the 'arch' and 'code' flags to match your graphics card. You can check the right code here.

    set(CUDA_NVCC_FLAGS 
      -gencode arch=compute_61,code=sm_61;
    ) 
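
For example, assuming your device is a Turing-generation GPU such as an RTX 2080 (compute capability 7.5), the flags would become the following; substitute the values that match your own card:

    set(CUDA_NVCC_FLAGS 
      -gencode arch=compute_75,code=sm_75;
    ) 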

For installation of CUDA, please go to the CUDA Toolkit page.

3. Run the Simulation

First, run Rviz with our configuration:

  <!-- go to your workspace and run: -->
  source devel/setup.bash
  roslaunch plan_manage rviz.launch

Then run the quadrotor simulator and Fast-Planner:

  <!-- open a new terminal, go to your workspace and run: -->
  source devel/setup.bash
  roslaunch plan_manage simulation.launch

Normally, you will see the randomly generated map and the drone model in Rviz. You can then select a goal for the drone using the 2D Nav Goal tool. Once a goal is set successfully, a new trajectory is generated immediately and executed by the drone. A sample is shown below:

4. Use in Your Application

If you have successfully run the simulation and want to use Fast-Planner in your own project, please explore the simulation.launch file. The important parameters you may need to change for your own setup are contained and documented there.

Note that in our configuration, the size of the depth image is 640x480. For higher map fusion efficiency, we downsample it (in kino_algorithm.xml, skip_pixel = 2). If you use depth images with a lower resolution (such as 256x144), you may want to disable the downsampling by setting skip_pixel = 1. Also, depth_scaling_factor is set to 1000, which may need to be changed according to your device.
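
As a rough sketch of how these two parameters are commonly interpreted (illustrative code only, not the actual fusion routine of plan_env; the struct and function names are invented for the example), depth_scaling_factor divides the raw sensor reading to obtain depth in meters, and skip_pixel strides over the image during fusion:

  // Illustrative sketch only: the usual meaning of skip_pixel and
  // depth_scaling_factor when iterating over a raw 16-bit depth image.
  #include <cstdint>
  #include <vector>

  struct DepthImage {
    int width = 640, height = 480;
    std::vector<uint16_t> data;  // raw depth values, row-major
  };

  void fuseDepth(const DepthImage& img, double depth_scaling_factor,
                 int skip_pixel) {
    for (int v = 0; v < img.height; v += skip_pixel) {
      for (int u = 0; u < img.width; u += skip_pixel) {
        const uint16_t raw = img.data[v * img.width + u];
        if (raw == 0) continue;                       // invalid measurement
        const double z = raw / depth_scaling_factor;  // e.g. 1000: mm -> m
        // ... back-project (u, v, z) as sketched in the Overview section and
        // insert the point into the volumetric map ...
        (void)z;  // placeholder for the actual map update
      }
    }
  }

With skip_pixel = 2, only one pixel in every 2x2 block is fused, which roughly quarters the per-frame fusion cost; with an already low-resolution depth image this saving is usually unnecessary, hence the suggestion to set skip_pixel = 1.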

Finally, please kindly give a STAR to this repo if it helps your research or work, thanks! :)

5. Acknowledgements

We use NLopt for non-linear optimization.

6. License

The source code is released under the GPLv3 license.

7. Disclaimer

This is research code. It is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of merchantability or fitness for a particular purpose.