This work applies neural networks to drone control and evaluates their performance in such systems. Two applications are implemented: first, a line-following application, in which the drone must follow a line around various circuits with the smallest possible error; and second, a gate-crossing application with mixed control, in which the drone is teleoperated but, whenever the pilot wishes, the neural network takes control and attempts to cross the gates that appear in its field of view.
Drone control is usually done with classical programming algorithms, relying on cameras and various sensors that require intensive processing. This means the drone either needs a processing unit as payload to handle all the data, or the data must be processed at the ground station, which adds communication delay and potential errors to a system that must operate in real time.
This project has two packages: drone_platforms, which contains all the drone platforms, and drone_behaviors, which contains the behaviors programmed for the drone. The drone can perform two applications: line following and gate traversing.
The repository follows the usual layout for machine learning projects, combined with the basic ROS 2 structure.
The models and the dataset take up significant space, so Hugging Face was chosen to host this data: one repository was created for the dataset and another for the models.
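For reference, this is one way to fetch those assets with the `huggingface_hub` client; the repository IDs below are placeholders, not the project's real ones:

```python
# Minimal sketch using the huggingface_hub client; the repo IDs below are
# placeholders for the actual dataset and model repositories.
from huggingface_hub import snapshot_download

# Download the dataset repository (repo_id is a placeholder)
snapshot_download(repo_id="USER/drone-dataset", repo_type="dataset",
                  local_dir="datasets/")

# Download the trained models (repo_id is a placeholder)
snapshot_download(repo_id="USER/drone-models", repo_type="model",
                  local_dir="models/")
```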
We use the aerostack2 platforms so that all the programmed behaviors can run on multiple drones. The platform launchers are in the drone_platforms package. This package was created to separate the platform from the behaviors, allowing the developed software to be used on real drones in the future.
- For launching the follow-line world:
cd /PATH_TO_PACKAGE/drone_platforms/launch
ros2 launch drone_platforms as2_sim_circuit.launch.py world:=/PATH_TO_WORLD/NAME.world yaw:=3.14
- For launching the gates world:
cd /PATH_TO_PACKAGE/drone_platforms/src
# Execute if you want to change the scenario
python3 generateGateWorld.py
ros2 launch drone_platforms as2_sim_gates.launch.py
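To give a rough idea of what the scenario generation involves, the sketch below drops randomly placed gate models into a Gazebo world file. The template strings, value ranges, and file names are illustrative assumptions, not the script's actual contents:

```python
# Illustrative sketch of randomized gate placement (not the actual script):
# it writes N gate models at random poses into a Gazebo world template.
import random

GATE_TEMPLATE = """
    <include>
      <uri>model://gate</uri>
      <name>gate_{i}</name>
      <pose>{x} {y} {z} 0 0 {yaw}</pose>
    </include>"""

def generate_gates(n=5):
    gates = []
    for i in range(n):
        gates.append(GATE_TEMPLATE.format(
            i=i,
            x=round(random.uniform(-10.0, 10.0), 2),
            y=round(random.uniform(-10.0, 10.0), 2),
            z=round(random.uniform(1.0, 3.0), 2),
            yaw=round(random.uniform(-3.14, 3.14), 2)))
    return "\n".join(gates)

# Insert the gates into a world template containing a {gates} placeholder
# (hypothetical template file, for illustration only)
with open("gates_template.world") as f:
    world = f.read().format(gates=generate_gates())
with open("gates.world", "w") as f:
    f.write(world)
```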
Both applications share the same training scripts; the basic training pipeline is invoked as follows:
- Go to the drone_behaviors src directory:
cd /PATH_TO_PACKAGE/drone_behaviors/src/
- Convert the rosbags to the standard-format dataset:
python3 features/rosbag2generalDataset.py --rosbags_path ROSBAGS_PATH
- Train a neural network with the standard-format dataset. This script receives the dataset path(s) and trains the model specified in PATH_TO_NEW_NETWORK.tar (a sketch of the checkpoint handling follows this list):
## model: Selects the model the user wants to train
## resume: Resumes a previous training run or starts a new one
## network: Network path
## Supports 4 dataset paths (--dp1, --dp2, --dp3, --dp4)
python3 models/train.py --model [pilotNet|deepPilot] --resume [true or false] --network PATH_TO_NEW_NETWORK.tar --dp1 STD_DATASET_PATH1
- Backup script for saving different weight checkpoints from a single training run:
./utils/netCheckpointSaver.sh MODEL_PATH.tar
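The `.tar` checkpoints suggest the usual PyTorch convention of bundling model and optimizer state in one file. A minimal sketch of how `--resume` can be handled under that assumption (the dictionary keys are illustrative, not necessarily those used by train.py):

```python
# Minimal sketch of .tar checkpoint handling in PyTorch, assuming the common
# convention of bundling model and optimizer state; key names are illustrative.
import os
import torch

def load_or_init(model, optimizer, network_path, resume):
    """Restore a previous run if requested and a checkpoint exists."""
    start_epoch = 0
    if resume and os.path.exists(network_path):
        checkpoint = torch.load(network_path)
        model.load_state_dict(checkpoint["model_state_dict"])
        optimizer.load_state_dict(checkpoint["optimizer_state_dict"])
        start_epoch = checkpoint["epoch"] + 1
    return start_epoch

def save_checkpoint(model, optimizer, epoch, network_path):
    """Save everything needed to resume training later."""
    torch.save({"epoch": epoch,
                "model_state_dict": model.state_dict(),
                "optimizer_state_dict": optimizer.state_dict()},
               network_path)
```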
For the imitation-learning validation in this exercise, we collected a dataset with an algorithmic pilot. Click the next image to watch the demo video:
For this application, the PilotNet network was used to control the linear and angular velocities. The expert pilot is launched as follows:
#! First launch the platform
## out_dir = Path where all the execution data will be stored.
## trace = Shows the filtered image in another window
ros2 launch drone_behaviors expertPilot.launch.py out_dir:=PATH trace_arg:=BOOL
If you want to launch the neural pilot instead of the expert pilot:
#! First launch the platform
## out_dir = Path where all the execution data will be stored.
## trace = Shows the filtered image in another window
## network_path = Model used for the velocity inference
ros2 launch drone_behaviors neuralPilot.launch.py out_dir:=PATH trace_arg:=BOOL network_path:=NET_PATH
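Conceptually, the neural pilot is an inference node that turns camera images into velocity commands. A simplified rclpy sketch is shown below; the topic names, preprocessing, and the hypothetical `build_pilotnet()` constructor are assumptions, not the package's actual code:

```python
# Simplified sketch of a neural pilot node: camera images in, velocities out.
# Topic names and preprocessing are assumptions, not the package's actual code.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from geometry_msgs.msg import Twist
from cv_bridge import CvBridge
import torch

class NeuralPilot(Node):
    def __init__(self, model):
        super().__init__("neural_pilot")
        self.model = model.eval()
        self.bridge = CvBridge()
        self.pub = self.create_publisher(Twist, "cmd_vel", 10)
        self.create_subscription(Image, "camera/image_raw", self.on_image, 10)

    def on_image(self, msg):
        frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
        tensor = torch.from_numpy(frame).permute(2, 0, 1).float().unsqueeze(0) / 255.0
        with torch.no_grad():
            linear, angular = self.model(tensor)[0]  # two outputs, as in PilotNet here
        cmd = Twist()
        cmd.linear.x = float(linear)
        cmd.angular.z = float(angular)
        self.pub.publish(cmd)

def main():
    rclpy.init()
    model = build_pilotnet()  # hypothetical constructor loading the trained network
    rclpy.spin(NeuralPilot(model))
```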
In this application, automation scripts were created to facilitate training. To repeatedly launch the expert pilot while recording the dataset:
## TIME_RECORDING = Integer that sets the recording time for each circuit.
## OUT_DIR = Path where all the execution data will be stored.
cd /PATH_TO_PACKAGE/drone_behaviors/utils
./generateDataset.sh TIME_RECORDING OUT_DIR
After that, you can use the automated training script:
./generateNeuralPilot OUT_DIR MODEL_DIR
To force the exit of a tmux session:
./end_tmux.sh SESSION_NAME
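To give an idea of what these helpers automate, here is a rough Python equivalent of the record-and-teardown loop; the circuit names, wait times, and tmux session names are assumptions:

```python
# Rough Python equivalent of the dataset-recording loop; circuit names,
# wait times, and tmux session names are illustrative assumptions.
import subprocess
import sys
import time

CIRCUITS = ["simple_circuit.world", "hard_circuit.world"]  # assumed names

def record_all(time_recording, out_dir):
    for world in CIRCUITS:
        # Bring up the simulator in a detached tmux session
        subprocess.run(["tmux", "new-session", "-d", "-s", "sim",
                        "ros2 launch drone_platforms as2_sim_circuit.launch.py "
                        f"world:={world}"], check=True)
        time.sleep(10)  # give the simulator time to come up
        # Launch the expert pilot, which records the execution data
        subprocess.run(["tmux", "new-session", "-d", "-s", "pilot",
                        "ros2 launch drone_behaviors expertPilot.launch.py "
                        f"out_dir:={out_dir}"], check=True)
        time.sleep(time_recording)  # record for the requested duration
        for session in ("pilot", "sim"):  # tear down, as end_tmux.sh does
            subprocess.run(["tmux", "kill-session", "-t", session], check=True)

if __name__ == "__main__":
    record_all(int(sys.argv[1]), sys.argv[2])
```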
To speed up model selection for this application, the following script was created. It runs every model in a given folder of n models and generates an image of the result for each one, as in the images below:
In this application we collected two datasets, one of them more generic, for the pilotNet training. The results of this application were successful as well:
In this video, you can see the drone maintaining a constant altitude:
In this video, you can see the drone adjusting its altitude:
Since this application uses a human expert pilot, the following control configuration was chosen to teleoperate the drone:
Launch this expert pilot with the following command:
#! First launch the platform
## out_dir = Path where all the execution data will be stored.
## net_dir = PilotNet model path
## deep_dir = DeepPilot model path
ros2 launch drone_behaviors remoteControl.launch.py out_dir:=PATH net_dir:=PILOT_NET_MODEL deep_dir:=DEEP_PILOT_MODEL
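The mixed control described above boils down to a switch between the human's commands and the networks' inferences. A schematic sketch follows; the button mapping, axis indices, and topic names are assumptions:

```python
# Schematic sketch of the mixed control: a joystick button hands control over
# to the networks. Button index and topic names are assumptions.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Joy
from geometry_msgs.msg import Twist

class RemoteControl(Node):
    def __init__(self):
        super().__init__("remote_control")
        self.autopilot = False
        self.pub = self.create_publisher(Twist, "cmd_vel", 10)
        self.create_subscription(Joy, "joy", self.on_joy, 10)
        self.create_subscription(Twist, "neural_cmd", self.on_neural, 10)

    def on_joy(self, msg):
        if msg.buttons[0]:        # assumed toggle button (edge detection omitted)
            self.autopilot = not self.autopilot
        if not self.autopilot:    # human pilot in command
            cmd = Twist()
            cmd.linear.x = msg.axes[1]
            cmd.angular.z = msg.axes[0]
            self.pub.publish(cmd)

    def on_neural(self, msg):
        if self.autopilot:        # PilotNet/DeepPilot output in command
            self.pub.publish(msg)
```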
This work is licensed under the Apache License 2.0.