Control a mobile robot on a 2D map using hand gestures with the MediaPipe library, replacing traditional WASD keyboard inputs.
Demo video: demo-control-mobile-robot.mp4
- Ubuntu 20.04
- ROS Noetic
- MediaPipe
- Stage ROS
Follow the official ROS Noetic installation instructions for Ubuntu 20.04. After installing, configure your environment by sourcing the ROS setup script:
echo "source /opt/ros/noetic/setup.bash" >> ~/.bashrc
source ~/.bashrc
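MediaPipe is not part of the ROS Noetic distribution, so install it for Python 3 with pip. The exact package set this project needs may differ; mediapipe together with opencv-python is a common baseline:
python3 -m pip install mediapipe opencv-python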
Create a new ROS workspace to host the packages:
mkdir -p ~/catkin_ws/src
cd ~/catkin_ws/
catkin_make
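Optionally, have every new terminal source the workspace automatically (this assumes the workspace path used above, ~/catkin_ws):
echo "source ~/catkin_ws/devel/setup.bash" >> ~/.bashrc
source ~/.bashrc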
Clone this repository into the src directory of your workspace:
cd ~/catkin_ws/src
git clone <repository-url> hand_gesture_control
cd ..
catkin_make
source devel/setup.bash
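As a quick check, verify that ROS can locate the newly built package (this assumes the package is named hand_gesture_control in its package.xml):
rospack find hand_gesture_control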
To run the project, open three terminals and execute the following commands in sequence:
Start the ROS master node:
roscore
In a new terminal, launch Stage ROS with a predefined world for simulation:
rosrun stage_ros stageros $(rospack find stage_ros)/world/willow-erratic.world
This command opens a 2D simulation world where the robot will navigate.
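Before adding gestures, you can optionally confirm that the simulated robot responds to velocity commands. With the single-robot willow-erratic world, Stage typically subscribes to /cmd_vel, so publishing a Twist message there should drive the robot forward:
rostopic pub -r 10 /cmd_vel geometry_msgs/Twist '{linear: {x: 0.2, y: 0.0, z: 0.0}, angular: {x: 0.0, y: 0.0, z: 0.0}}'
Stop the command with Ctrl+C once you see the robot move.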
Finally, in another new terminal, launch the hand gesture control nodes:
roslaunch hand_gesture_control hand_gesture_control_launch.launch
This command starts the nodes necessary for hand gesture recognition and robot control.
After launching all components:
- The Stage ROS window displays the robot in a 2D environment.
- The MediaPipe node processes hand gestures and translates them into robot movement commands (see the quick check below).
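To watch the gesture node's output while you move your hand, echo the velocity topic in a separate terminal (assuming the control node publishes geometry_msgs/Twist messages on /cmd_vel, as the Stage robot expects):
rostopic echo /cmd_vel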
We welcome contributions! Please read our contributing guidelines to learn about our review process, coding conventions, and more. Contributions can be made via pull requests to the repository.