
RePAIR ROS Robot

1) Description

This repository contains the software to control the simulated and real RePAIR robot.

Dependencies

The dependencies (ROS Noetic, the Python packages in requirements.txt, Moveit, Gazebo, and XBot2) are covered by the installation steps below.

2) Installation

  • Clone the repository along with the submodules

     mkdir -p ~/repair_robot_ws/src && cd ~/repair_robot_ws/src
    
     git clone --recurse-submodules -j8 https://github.com/RePAIRProject/repair_ros_robot.git
  • Install required Python packages

     cd ~/repair_robot_ws/src/repair_ros_robot
     pip3 install -r requirements.txt
  • Build the workspace

    • source ROS (source /opt/ros/noetic/setup.bash) in all terminals
     cd ~/repair_robot_ws
    
     catkin build
    Troubleshooting
    If you get build errors similar to the following (where package_name is the name of the missing package):
     CMake Error at /opt/ros/noetic/share/catkin/cmake/catkinConfig.cmake:83 (find_package):
     Could not find a package configuration file provided by
     "package_name" with any of the following names:
    
     	package_nameConfig.cmake
     	package_name-config.cmake
    
     Add the installation prefix of "package_name" to CMAKE_PREFIX_PATH
     or set "package_name" to a directory containing one of the
     above files.  If "package_name" provides a separate development
     package or SDK, be sure it has been installed.

    Check the list below:

    Failure for realsense2 (missing "ddynamic_reconfigure")

    According to this issue, it can be installed by running:

     sudo apt-get install ros-noetic-ddynamic-reconfigure 

    Failure for repair_moveit_xbot (missing "moveit_ros_planning")

    Install Moveit by

     sudo apt-get install ros-noetic-moveit
    

    Failure for repair_moveit_xbot (missing "rviz_visual_tools")

    Install it by

     sudo apt-get install ros-noetic-rviz-visual-tools
    

    Failure for repair_moveit_xbot (missing "moveit_visual_tools")

    Install it by

     sudo apt-get install ros-noetic-moveit-visual-tools 
    
    • After a successful build, source the workspace in all terminals
     cd ~/repair_robot_ws
    
     source devel/setup.bash
    
  • For Gazebo:

    • Copy the fragment model folder from repair_urdf/sdf/frag3 to ~/.gazebo/models/frag3 (see the example command below).
    • Change the path in pysdf/src/pysdf/parser.py (line 26) to your catkin workspace src path.
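
    For reference, assuming the workspace was cloned to ~/repair_robot_ws/src as above, the copy step might look like this (a sketch; adjust the paths to your setup):

     # copy the fragment model into the local Gazebo model database
     mkdir -p ~/.gazebo/models
     cp -r ~/repair_robot_ws/src/repair_ros_robot/repair_urdf/sdf/frag3 ~/.gazebo/models/frag3
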
  • Install XBot2 to use the real robot drivers (Source)

     sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu $(lsb_release -sc) main" > /etc/apt/sources.list.d/ros-latest.list'
    
    sudo apt install curl 
    curl -s https://raw.githubusercontent.com/ros/rosdistro/master/ros.asc | sudo apt-key add -
    
    sudo apt update && sudo apt install -y \
    ros-noetic-ros-base \
    libgazebo11-dev
    
    echo ". /opt/ros/noetic/setup.bash" >> ~/.bashrc
    
    source $HOME/.bashrc
    sudo apt install -y \
    ros-$ROS_DISTRO-urdf ros-$ROS_DISTRO-kdl-parser \
    ros-$ROS_DISTRO-eigen-conversions ros-$ROS_DISTRO-robot-state-publisher ros-$ROS_DISTRO-moveit-core \
    ros-$ROS_DISTRO-rviz ros-$ROS_DISTRO-interactive-markers ros-$ROS_DISTRO-tf-conversions ros-$ROS_DISTRO-tf2-eigen \
    qttools5-dev libqt5charts5-dev qtdeclarative5-dev
    
    sudo sh -c 'echo "deb http://xbot.cloud/xbot2/ubuntu/$(lsb_release -sc) /" > /etc/apt/sources.list.d/xbot-latest.list'
    wget -q -O - http://xbot.cloud/xbot2/ubuntu/KEY.gpg | sudo apt-key add -  
    sudo apt update
    sudo apt install xbot2_desktop_full
    
    echo ". /opt/xbot/setup.sh" >> ~/.bashrc
  • Set up XBot2 to use the real robot drivers

     set_xbot2_config ~/repair_robot_ws/src/repair_ros_robot/repair_cntrl/config/repair_basic.yaml

For documentation on the repair_interface package, see the repair_interface directory.

3) Usage

RVIZ

RVIZ visualization with joint sliders

roslaunch repair_urdf repair_full_slider.launch 

RVIZ visualization without joint sliders

roslaunch repair_urdf repair_full.launch

Gazebo simulation

View the robot in Gazebo

roslaunch repair_gazebo repair_gazebo.launch
  • You can ignore the following error message; the model uses position controllers, while p gains are only needed for effort controllers:

     [ERROR] [1675347973.116238028]: No p gain specified for pid. Namespace: /gazebo_ros_control/pid_gains/x_joint

Motion planning and execution with Moveit and ros_control in Gazebo

roslaunch repair_gazebo bringup_moveit.launch launch_gazebo:=true sh_version:=v1_2_research fixed_hands:=false
  • launch_gazebo:=true/false (default false): if true, launches Gazebo for simulation; if false (default), targets the real robot
  • sh_version:=v1_2_research/v1_wide/mixed_hands (default v1_2_research): selects the standard (small) hands, the wide hands, or a standard hand on the right and a wide hand on the left
  • fixed_hands:=true/false (default true): true is required for planning with the real robot but prevents planning in simulation; set it to false to plan in Gazebo (see the example below)
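
For example, based on the arguments above, a session targeting the real robot with the standard hands might be launched as follows (a sketch only; adjust the arguments to your setup):

roslaunch repair_gazebo bringup_moveit.launch launch_gazebo:=false sh_version:=v1_2_research fixed_hands:=true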

Errors and warnings

  • You can ignore the following error message; the model uses position controllers, while p gains are only needed for effort controllers:

     [ERROR] [1675347973.116238028]: No p gain specified for pid. Namespace: /gazebo_ros_control/pid_gains/x_joint
  • You can ignore the warning messages about links that are unknown to the URDF (e.g. [ WARN] [1706696422.918657977, 1.124000000]: Link 'right_hand_v1_2_research_thumb_proximal_link' is not known to URDF. Cannot disable/enable collisons.); they don't affect the simulation, you just won't be able to see the hands opening/closing in Rviz

XBot2

XBot2 is required when you want to control the real robot. In addition, it provides a dummy mode that emulates the real robot interface. The dummy mode allows you to use RVIZ with Moveit through the real robot controls instead of ros_control. Currently, this repository does not support using the dummy mode with Gazebo.

Dummy mode

  • First, you have to configure your .bashrc so that the roscore is running on your local machine. For this purpose, add the following lines to your .bashrc.

     export ROS_MASTER_URI=http://{local_IP}:11311
     export ROS_IP={local_IP}
  • Then, source your .bashrc and start the roscore in window 1.

     roscore
  • Start XBot2 in window 2.

     xbot2-core --hw dummy
  • Now you have to start the bridge between XBot2 and ROS in window 3.

     rosrun repair_moveit_xbot moveit_xbot_bridge_node
  • Finally, in window 4 you can start RVIZ and Moveit to control the emulated robot.

     roslaunch repair_moveit_xbot bringup_moveit.launch
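
As a quick sanity check that xbot2-core is up and exposing its ROS interface, you can inspect the topics described under "Information about used topics" below, e.g.:

     # list the xbotcore topics and print one joint state message
     rostopic list | grep xbotcore
     rostopic echo -n 1 /xbotcore/joint_states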

Real robot

  • First, you have to configure your .bashrc so that the roscore is running on the robot PC. For this purpose, add the following lines to your .bashrc.

     export ROS_MASTER_URI=http://{robot_IP}:11311
     export ROS_IP={local_IP}
  • Then, source your .bashrc and connect via ssh to the real robot PC. You will need at least 3 remote command windows.

     ssh -X {host_name}@{robot_IP}
  • Check in remote window 1 that the roscore is running. If it is not, you can restart it with the system command systemctl --user restart roscore.service

     rostopic list
  • Then use the same window to start the ecat_master.

     ecat_master
  • Now start XBot in remote window 2

     xbot2-core --hw ec_pos

    This starts the motors in position-control mode. Alternatively, you can start them in idle mode using xbot2-core --hw idle

  • Now you have to manually start the motor cooling fans in remote window 3.

     rosservice call /ec_client/set_motors_fan "motor_name: ['']
     position: [0]
     velocity: [0]
     torque: [0]
     amperage: [0]
     homing_position: [0]
     fan: [true]
     led: [false]"
  • On your local PC you will need at least 3 command windows. In command window 1, you have to open the robot GUI. Here, you have to move the robot to the home position by clicking on the Home button.

     xbot2-gui
  • In local window 2 you have to start the bridge between XBot and Moveit.

     rosrun repair_moveit_xbot moveit_xbot_bridge_node
  • Afterwards, you can start Moveit in local window 3.

     roslaunch repair_moveit_xbot bringup_moveit.launch

Moveit configuration

  • To increase/reduce the number of points for a trajectory, update the following parameter for arm_1 and arm_2 in repair_moveit_config_v2/config/ompl_planning.yaml

     longest_valid_segment_fraction: 0.00005
  • To increase/decrease the velocity of arm joints, update the following parameter in repair_moveit_config_v2/config/joint_limits.yaml

     default_velocity_scaling_factor: 0.1
     default_acceleration_scaling_factor: 0.1
  • Alternatively, velocity and acceleration scaling factors can be updated in the Rviz Motion Planning plugin before planning a path.

Run the pick and place demo

The goal of the demo is to pick and place a fresco fragment. There are two versions of the demo: the manual demo moves to fixed poses, while the moveit demo uses a perception pipeline to determine a grasp pose. Both demos can be run in Gazebo or with the real robot; set the gazebo argument accordingly. The side argument defines whether the left or the right hand is used to grasp the fragment.

Without perception

roslaunch repair_interface manual_test.launch side:=right gazebo:=false

With perception

First, you have to move the arms and hands out of the field of view of the torso camera. Afterwards, you can execute the following launch file.

roslaunch repair_interface moveit_test.launch side:=right gazebo:=false

At the beginning, two windows will pop up; close them by pressing the q key.

Int Week 2 partial update

First terminal

roslaunch repair_gazebo bringup_moveit.launch launch_gazebo:=true sh_version:=v1_2_research fixed_hands:=false

Second terminal

rosrun repair_interface moveit_client.py _use_gazebo:=true

Recognition

rosrun repair_interface moveit_multi_fresco_with_recognition.py _side:=right _gazebo:=true

To run recognition, a few files need to be added (ask Luca Palmieri for the files):

  • RPf_00123 to RPf_001266 should be added to repair_ros_robot/repair_urdf/sdf
  • RPf_00123 to RPf_001266 should also be added to ~/.gazebo/models
  • The fragment database directory fragments_db should be added to ~/.gazebo (see the sketch below)
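
Assuming the fragment folders and the fragments_db directory have been obtained (the ~/Downloads source path below is only illustrative), copying them might look like:

     cp -r ~/Downloads/RPf_* ~/repair_robot_ws/src/repair_ros_robot/repair_urdf/sdf/
     cp -r ~/Downloads/RPf_* ~/.gazebo/models/
     cp -r ~/Downloads/fragments_db ~/.gazebo/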

Information about used topics

  • To inspect all the topics exposed by xbot2, run rostopic list
  • Send commands to the joints (SoftHand excluded) using the /xbotcore/command topic
  • Read joint states (SoftHand excluded) using the /xbotcore/joint_states topic
  • Send commands to the SoftHands using the /{left/right}_hand_v1s/synergy_command topic, or inspect the state of each finger on the /{left/right}_hand_v1s/{fingername}_state topic (see the examples below)
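
For example, the joint state and SoftHand topics can be used from the command line as follows; the snippet assumes the synergy command is a single std_msgs/Float64 value (a sketch, verify the actual message type with rostopic info):

     # print one joint state message (SoftHand excluded)
     rostopic echo -n 1 /xbotcore/joint_states

     # close the right SoftHand by publishing a synergy value (assumed range 0.0-1.0)
     rostopic pub --once /right_hand_v1s/synergy_command std_msgs/Float64 "data: 1.0"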

4) Known Issues

  • The URDF does not reflect the real robot yet.
  • The planning parameters have to be optimized so that the robot does not stop between subsequent poses.

5) Relevant publications

T.B.A.