Use the NVIDIA JetBot kit to autonomously construct a floor plan of a room.
Start up the JetBot and perform the basic setup:
git clone https://github.com/ThomasR155/jetbot_ros
cd jetbot_ros
docker/run.sh
Start the JetBot motors and camera:
ros2 launch jetbot_ros jetbot_nvidia.launch.py
To run the following commands, open a new terminal session inside the container:
sudo docker exec -it jetbot_ros /bin/bash
ros2 launch jetbot_ros teleop_keyboard.launch.py
The keyboard controls are as follows:
w/x: increase/decrease linear velocity
a/d: increase/decrease angular velocity
space key, s: force stop
Press Ctrl+C to quit.
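The key handling above can be sketched roughly as follows. This is a hypothetical simplification, not the repo's actual teleop code: the real node publishes geometry_msgs/Twist messages, and the step sizes here are assumptions.

```python
# Hypothetical sketch of the teleop key-to-velocity mapping.
# LIN_STEP / ANG_STEP are assumed values, not taken from the repo.
LIN_STEP = 0.05  # m/s change per keypress (assumption)
ANG_STEP = 0.10  # rad/s change per keypress (assumption)

def apply_key(key, linear, angular):
    """Return the updated (linear, angular) velocities after one keypress."""
    if key == 'w':
        linear += LIN_STEP      # speed up forward
    elif key == 'x':
        linear -= LIN_STEP      # slow down / reverse
    elif key == 'a':
        angular += ANG_STEP     # turn left faster
    elif key == 'd':
        angular -= ANG_STEP     # turn right faster
    elif key in (' ', 's'):
        linear, angular = 0.0, 0.0  # force stop
    return linear, angular
```

In the real node these values would be packed into a Twist message and published to the motor controller each time a key is pressed.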
Run this from inside the container, substituting the path of the dataset you collected (by default it is saved in a timestamped folder under /workspace/src/jetbot_ros/data/datasets/):
cd /workspace/src/jetbot_ros/jetbot_ros/dnn
python3 train.py --data /workspace/src/jetbot_ros/data/datasets/20211018-160950/
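Before training, it is common to hold out part of the collected images for validation. The helper below is only an illustrative sketch; the repo's train.py may split the dataset differently, and the 10% validation fraction is an assumption.

```python
import random

def split_dataset(image_paths, val_frac=0.1, seed=0):
    """Shuffle collected image paths and split them into train/val lists.

    Illustrative sketch only: val_frac=0.1 and the fixed seed are
    assumptions, not values taken from the repo's train.py.
    """
    rng = random.Random(seed)          # fixed seed for a reproducible split
    paths = list(image_paths)
    rng.shuffle(paths)
    n_val = max(1, int(len(paths) * val_frac))
    return paths[n_val:], paths[:n_val]
```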
Install the RPLIDAR driver for ROS2 (from source)
Make sure a suitable lidar is connected to your JetBot via USB and mounted on top of it.
Then start the Lidar driver by running:
ros2 run rplidar_ros rplidarNode
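Once the driver is running, it publishes sensor_msgs/LaserScan messages on /scan. The sketch below shows how such a scan could be interpreted to find the closest obstacle; it is a plain-Python illustration (no rclpy), and only the LaserScan field names (ranges, angle_min, angle_increment) come from the standard message definition.

```python
import math

def nearest_obstacle(ranges, angle_min, angle_increment):
    """Return (distance, angle_rad) of the closest valid lidar reading.

    Mirrors how a sensor_msgs/LaserScan from rplidarNode could be
    consumed: invalid readings (inf/NaN/zero) are skipped, and the
    angle of beam i is angle_min + i * angle_increment.
    """
    best = None
    for i, r in enumerate(ranges):
        if math.isfinite(r) and r > 0.0:       # drop inf/NaN/zero returns
            if best is None or r < best[0]:
                best = (r, angle_min + i * angle_increment)
    return best
```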
Install rf2o_laser_odometry (from source)
Odometry is started automatically; if you want to start it manually, run:
ros2 launch rf2o_laser_odometry rf2o_laser_odometry.launch.py
Install slam_toolbox
Place the robot in your room and make sure its path is unobstructed.
Run SLAM + odometry with the correct settings:
ros2 launch jetbot_ros jetbot_custom_slam.launch.py
Use your custom-trained model to navigate the room autonomously, substituting the path to your model below:
ros2 launch jetbot_ros nav_model.launch.py model:=/workspace/src/jetbot_ros/data/models/202106282129/model_best.pth
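Conceptually, the navigation node turns each model prediction into a motor command. The sketch below follows the pattern used in NVIDIA's JetBot road-following examples, where the model predicts a target point in the camera image; the speed and gain values are assumptions, not parameters from this repo.

```python
import math

def steering_command(x, y, speed=0.15, gain=0.8):
    """Convert a model's predicted target point (x, y) into velocities.

    Hypothetical sketch: (x, y) is the predicted point in normalized
    image coordinates, speed and gain are assumed tuning constants.
    Returns (linear, angular) for the motor controller.
    """
    angle = math.atan2(x, y)        # heading error toward the target point
    return speed, -gain * angle     # steer opposite the error to correct it
```

A point left of center (x < 0) yields a positive angular velocity (turn left), and vice versa; the real node would feed these values into a Twist message on every camera frame.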
Create the floor plan by letting the robot navigate through the room autonomously, then save the map (slam_toolbox writes map.pgm and map.yaml):
ros2 service call /slam_toolbox/save_map slam_toolbox/srv/SaveMap "name: {data: 'map'}"
We would like to express our gratitude to Prof. Dr. Patrick Glauner for his very interesting lectures and for providing the hardware for this project.
Project team: Thomas Riedl, Leon Madest, Darwin Sucuzhañay and Ankit Singh Rawat