This project implements human detection and tracking on a quadrotor. The platform is a ModalAI m500 with the VOXL Flight autonomy computer and a PX4 flight controller.
Summary of tasks achieved:
- Implemented a ROS node for accessing and viewing YOLOv5 output.
- Controlled the yaw angle based on the bounding-box center.
- Set up a 1D LiDAR for forward and backward motion.
- Programmed PD controllers for both motions (a minimal sketch follows below).
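The project's actual controller lives in the `human_tracking` package; the following is only a minimal, self-contained C++ sketch of the two PD loops described above. The gains, image width, target standoff distance, and example measurements are illustrative assumptions, not values from this project.

```cpp
#include <iostream>

// Simple PD controller: u = Kp*e + Kd*de/dt.
struct PD {
    double kp, kd;
    double prev_error = 0.0;
    double update(double error, double dt) {
        // Note: the first derivative sample is noisy since prev_error starts at 0.
        double derivative = (error - prev_error) / dt;
        prev_error = error;
        return kp * error + kd * derivative;
    }
};

int main() {
    // Illustrative gains only; real gains must be tuned on the vehicle.
    PD yaw_pd{0.005, 0.001};   // yaw rate from bounding-box pixel error
    PD forward_pd{0.8, 0.2};   // forward velocity from LiDAR range error

    const double image_center_x = 320.0;  // assuming a 640-px-wide image
    const double target_range_m = 2.0;    // assumed desired standoff distance
    const double dt = 0.05;               // 20 Hz control loop

    // One control step with example measurements.
    double bbox_center_x = 400.0;  // detected person's bounding-box center (px)
    double lidar_range_m = 3.5;    // measured forward distance (m)

    // Positive pixel error -> person is to the right -> yaw right.
    double yaw_rate_cmd = yaw_pd.update(bbox_center_x - image_center_x, dt);
    // Positive range error -> person too far -> move forward.
    double forward_vel_cmd = forward_pd.update(lidar_range_m - target_range_m, dt);

    std::cout << "yaw rate cmd: " << yaw_rate_cmd
              << " rad/s, forward vel cmd: " << forward_vel_cmd << " m/s\n";
}
```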
Demo videos: `normal_third_person_view.online-video-cutter.com.1.mp4` (third-person view) and `normal_drone_camera_feed.online-video-cutter.com.mp4` (onboard camera feed).
These instructions will get a local copy of the project up and running; follow the steps below.
- OS - Linux (tested)
- Object detection model - YOLOv5
- Hardware - hires camera, 1D LiDAR
- Software - PX4, ROS 1, C++
- Package required - voxl_mpa_to_ros
- Docker - ROS Melodic with OpenCV 1.2 (mavros and mavros_extras required)
- Ground station - QGroundControl
- Add the following to the `.bashrc` of your local machine, replacing `<Drone_IP>` with the drone's IP address (the ROS master runs on the drone):

  ```sh
  export ROS_MASTER_URI=http://<Drone_IP>:11311
  ```

- Set up the `tflite` service for YOLOv5 and select the hires camera.
- Clone the `human_tracking` package into the `src` folder of your workspace and run `catkin_make`.
To start offboard mode, open four terminals and do the following.
- First terminal: `ssh` into the drone and source the ROS variables, then start `voxl_mpa_to_ros`:

  ```sh
  roslaunch voxl_mpa_to_ros voxl_mpa_to_ros.launch
  ```

- Second terminal: enter the docker image, source the workspace, and start the controller:

  ```sh
  roslaunch human_tracking follower.launch
  ```

- Third terminal: enter the docker image, source the workspace, and view the camera output with bounding boxes:

  ```sh
  rosrun human_tracking view
  ```

- Fourth terminal: source the `.bashrc` of the local machine, run `rqt`, and select the `/human_bb_view` topic for image view.
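For reference, the snippet below is a minimal sketch of how a mavros-based node typically requests OFFBOARD mode and arms the vehicle; it is not taken from the `human_tracking` package, and the zero-velocity setpoints are placeholders for the PD controller outputs. PX4 requires setpoints to already be streaming before it accepts the OFFBOARD switch.

```cpp
#include <ros/ros.h>
#include <geometry_msgs/TwistStamped.h>
#include <mavros_msgs/SetMode.h>
#include <mavros_msgs/CommandBool.h>

int main(int argc, char** argv) {
    ros::init(argc, argv, "offboard_sketch");
    ros::NodeHandle nh;

    // Velocity setpoints; a follower node would fill these from its PD outputs.
    ros::Publisher vel_pub = nh.advertise<geometry_msgs::TwistStamped>(
        "mavros/setpoint_velocity/cmd_vel", 10);
    ros::ServiceClient set_mode =
        nh.serviceClient<mavros_msgs::SetMode>("mavros/set_mode");
    ros::ServiceClient arming =
        nh.serviceClient<mavros_msgs::CommandBool>("mavros/cmd/arming");

    ros::Rate rate(20.0);             // PX4 needs setpoints faster than 2 Hz
    geometry_msgs::TwistStamped cmd;  // zero-velocity placeholder

    // Stream setpoints before requesting OFFBOARD, as PX4 requires.
    for (int i = 0; i < 100 && ros::ok(); ++i) {
        cmd.header.stamp = ros::Time::now();
        vel_pub.publish(cmd);
        rate.sleep();
    }

    mavros_msgs::SetMode mode;
    mode.request.custom_mode = "OFFBOARD";
    set_mode.call(mode);

    mavros_msgs::CommandBool arm;
    arm.request.value = true;
    arming.call(arm);

    while (ros::ok()) {
        cmd.header.stamp = ros::Time::now();
        vel_pub.publish(cmd);  // replace with PD controller outputs
        ros::spinOnce();
        rate.sleep();
    }
}
```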
Distributed under the MIT License. See the `LICENSE` file for more information.