This project contains the launchers to run the Tiago robot from PAL Robotics and the Turtlebot2 Kobuki, both in simulation, running different Gazebo worlds (including the AWS RoboMaker worlds), and on the real robot using its drivers.
Recommended: use Eclipse Cyclone DDS. Install it and select it through the `RMW_IMPLEMENTATION` environment variable (add the `export` line to your `.bashrc` so it persists):

```shell
sudo apt install ros-humble-rmw-cyclonedds-cpp
export RMW_IMPLEMENTATION=rmw_cyclonedds_cpp
```
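A minimal sketch of persisting and checking the setting (it assumes `~/.bashrc` is your shell startup file; an empty variable means ROS 2 falls back to its default RMW):

```shell
# Make the selection permanent for future shells.
echo 'export RMW_IMPLEMENTATION=rmw_cyclonedds_cpp' >> ~/.bashrc
# Select it for the current shell and confirm which RMW will be used.
export RMW_IMPLEMENTATION=rmw_cyclonedds_cpp
echo "${RMW_IMPLEMENTATION:-<default>}"
```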
You need to have previously installed ROS 2. Please follow this guide if you don't have it, then source it in your shell:

```shell
source /opt/ros/humble/setup.bash
```
Clone the repository to your workspace:

```shell
cd <ros2-workspace>/src
git clone https://github.com/IntelligentRoboticsLabs/ir_robots.git
```
Prepare your thirdparty repos:

```shell
sudo apt update
sudo apt install python3-vcstool python3-pip python3-rosdep python3-colcon-common-extensions -y
cd <ros2-workspace>/src/
vcs import < ir_robots/thirdparty.repos
```

Please make sure that this last command has not failed; if it has, run it again.
```shell
sudo apt install libusb-1.0-0-dev libftdi1-dev libuvc-dev
```
When you connect a piece of hardware to your PC, it is assigned a `/dev/ttyUSB*` device. This device will not have the necessary read/write permissions, so we will not be able to use it correctly. The solution is to set up udev rules that create a symlink with another name (example: `/dev/ttyUSB0` -> `/dev/kobuki`) and grant it the necessary permissions.
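As an illustration of what such a rule contains (a sketch only: the real rules are the ones copied from the driver repositories below, and the FTDI vendor/product IDs shown here are assumptions for a typical Kobuki serial adapter):

```shell
# Example udev rule: match a serial device by its USB vendor/product
# attributes, make it world read/writable, and add a /dev/kobuki symlink.
# Written to a local file here only to show the format.
echo 'SUBSYSTEM=="tty", ATTRS{idVendor}=="0403", ATTRS{idProduct}=="6001", MODE:="0666", SYMLINK+="kobuki"' > 60-kobuki-example.rules
cat 60-kobuki-example.rules
```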
```shell
cd <ros2-workspace>
sudo cp src/ThirdParty/ros_astra_camera/astra_camera/scripts/56-orbbec-usb.rules /etc/udev/rules.d/
sudo cp src/ThirdParty/rplidar_ros/scripts/rplidar.rules /etc/udev/rules.d/
sudo cp src/ThirdParty/kobuki_ftdi/60-kobuki.rules /etc/udev/rules.d/
sudo udevadm control --reload-rules && sudo udevadm trigger
```
Some cameras need a calibration file that indicates, for example, their resolution, name, etc.:

```shell
mkdir -p ~/.ros/camera_info
cp <ros2-workspace>/src/ThirdParty/openni2_camera/openni2_camera/rgb_PS1080_PrimeSense.yaml ~/.ros/camera_info
```
Install the remaining dependencies with rosdep and build the workspace:

```shell
sudo rosdep init
rosdep update
rosdep install --from-paths src --ignore-src -r -y
colcon build --symlink-install --cmake-args -DBUILD_TESTING=OFF
```

For this build we use `--cmake-args -DBUILD_TESTING=OFF` to avoid compiling our tests as well and save time. It is recommended to compile later without this flag.
```shell
source /usr/share/gazebo/setup.bash
source <ros2-workspace>/install/setup.bash
```

It is recommended to add these two lines to your `.bashrc` to avoid having to run them every time you open a new shell.
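One way to do that (a sketch; replace `<ros2-workspace>` with your actual workspace path before relying on the second line):

```shell
# Append both source lines to ~/.bashrc so every new shell runs them.
echo 'source /usr/share/gazebo/setup.bash' >> ~/.bashrc
echo 'source <ros2-workspace>/install/setup.bash' >> ~/.bashrc
tail -n 2 ~/.bashrc
```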
On the computers of the university laboratories you will already have all the ROS 2 packages installed, so you will not have to install any tools or dependencies. Add a `COLCON_IGNORE` file to the packages used by the real robots to avoid problems when building:
```shell
cd <ros2-workspace>/src/
git clone https://github.com/IntelligentRoboticsLabs/ir_robots.git
vcs import < ir_robots/thirdparty.repos
touch ThirdParty/ros_astra_camera/astra_camera/COLCON_IGNORE
touch ThirdParty/ros_astra_camera/astra_camera_msgs/COLCON_IGNORE
touch ThirdParty/kobuki_ros/kobuki_node/COLCON_IGNORE
touch ThirdParty/kobuki_ros/kobuki_auto_docking/COLCON_IGNORE
touch ThirdParty/kobuki_core/COLCON_IGNORE
touch ThirdParty/kobuki_ftdi/COLCON_IGNORE
touch ThirdParty/ecl/ecl_core/COLCON_IGNORE
touch ThirdParty/ecl/ecl_lite/COLCON_IGNORE
touch ThirdParty/openni2_camera/openni2_camera_msgs/COLCON_IGNORE
touch ThirdParty/openni2_camera/openni2_camera/COLCON_IGNORE
source /opt/ros/<ros2-distro>/setup.bash
cd <ros2-workspace>
colcon build --symlink-install
```
Remember that in the laboratories you will only be able to run the simulation of the environment with the different robots; you will not be able to use it on a real robot.
Modify `config/params.yaml` to select the robot (kobuki/tiago), the world, and the starting position:

```yaml
...
ir_robots:
  simulation: true
  world: aws_house
  robot: kobuki
  robot_position:
    x: 0.0
    y: 0.0
    z: 0.0
    roll: 0.0
    pitch: 0.0
    yaw: 0.0
  tiago_arm: no-arm
  kobuki_camera: none
  kobuki_lidar: false
...
```
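For example, switching the configured robot only requires changing the `robot` key. The sketch below operates on a throwaway copy of the relevant lines (the real file lives at `config/params.yaml` inside the package):

```shell
# Demo copy of the relevant lines; edit the real config/params.yaml the same way.
printf 'ir_robots:\n  simulation: true\n  robot: kobuki\n' > params_demo.yaml
# Change "robot: kobuki" to "robot: tiago" in place.
sed -i 's/^\(  robot:\).*/\1 tiago/' params_demo.yaml
cat params_demo.yaml
```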
Then, launch your simulation environment:

```shell
source install/setup.sh
ros2 launch ir_robots simulation.launch.py
```
If performance is low, close Gazebo's client: check for the gzclient process and kill it:

```shell
kill -9 `pgrep -f gzclient`
```
First, modify `config/params.yaml` to use the kobuki, indicating which camera you are using (xtion/astra/none) and whether you are using a lidar (true/false):
```yaml
...
ir_robots:
  simulation: true
  world: aws_house
  robot: kobuki
  robot_position:
    x: 0.0
    y: 0.0
    z: 0.0
    roll: 0.0
    pitch: 0.0
    yaw: 0.0
  kobuki_camera: astra
  kobuki_lidar: true
...
```
Then, run the kobuki drivers:

```shell
source install/setup.sh
ros2 launch ir_robots kobuki.launch.py
```
You can use Nav2 with the robot in the world selected in your `config/params.yaml`:

```shell
source install/setup.sh
ros2 launch ir_robots navigation.launch.py
```
If the `simulation` param is set to `false`, Navigation2 is ready to use on the real robot.
You can also use Keepout Zones: just create a new map that includes the excluded areas, save it under the same name with `_keep` appended, and publish the map by running:

```shell
source install/setup.sh
ros2 launch ir_robots keepzone.launch.py
```

Only some AWS worlds are included.
If you want to make your own map, set the `slam` parameter to `true` in `config/params.yaml`, then launch navigation and save the map:

```shell
source install/setup.sh
ros2 launch ir_robots navigation.launch.py
ros2 run nav2_map_server map_saver_cli --ros-args -p use_sim_time:=true
```
Move the map to the `maps/` folder inside the package. Then remember to rename it and modify the image name inside the yaml. Finally, modify the `world` parameter, adding the name of your new map.
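The rename step can be sketched like this (illustrative only: `my_world` and the stand-in file below are assumptions; `map_saver_cli` normally produces the `.yaml`/`.pgm` pair for you):

```shell
# Stand-in for the files map_saver_cli writes (the yaml references the pgm).
printf 'image: map.pgm\nresolution: 0.05\n' > map.yaml
# Rename the yaml and point its "image" field at the renamed pgm.
mv map.yaml my_world.yaml
sed -i 's/^image: .*/image: my_world.pgm/' my_world.yaml
cat my_world.yaml
```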
This is a project made by the Intelligent Robotics Lab, a research group from the Universidad Rey Juan Carlos. Copyright © 2023.
Maintainers:
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.