leap_teleop

A modular ROS platform for gesture-based teleoperated grasping with the KUKA iiwa serial robots and the ReFlex TakkTile robotic hand.


Gesture-Based Teleoperated Grasping

[Grasping demo]

This is the official code repository for the publication "Gesture-Based Teleoperated Grasping for Educational Robotics", presented at the IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN) 2021. We present an interactive robotic platform for teleoperated grasping as an educational tool. Our teleoperation method uses the Leap Motion optical gesture tracker to simultaneously control all four degrees of freedom (DOF) of the ReFlex TakkTile robotic hand and the six-DOF tool pose of the KUKA iiwa (7 R800 or 14 R820) serial manipulator.

For more details on the control algorithm, please refer to the paper or the handbook. Watch a video of the system in action here.
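
As a rough illustration of this data flow, the MATLAB sketch below maps a tracked palm pose to a tool-pose command and a single grip value to the hand's joints using the Robotics System Toolbox. It is not the controller from the paper: the topic names, message types, and field names are assumptions (the KUKA command topic is entirely hypothetical), and it requires the custom messages to be available in MATLAB. Refer to the paper and to the leap_hand and leap_kuka packages for the actual implementation.
% Illustrative sketch only -- topic, message, and field names are assumptions.
rosinit                                              % connect to the ROS master

leapSub = rossubscriber('/leapmotion/data', 'leap_motion/leapros');              % assumed Leap driver topic
handPub = rospublisher('/reflex_takktile/command_position', 'reflex_msgs/PoseCommand');
kukaPub = rospublisher('/kuka/tool_pose_cmd', 'geometry_msgs/Pose');             % hypothetical KUKA topic

while true                                           % run until Ctrl+C
    leap = receive(leapSub, 1);                      % wait up to 1 s for a Leap frame

    % 6-DOF arm command: palm position and orientation -> tool pose
    poseMsg = rosmessage(kukaPub);
    poseMsg.Position.X = 1e-3 * leap.Palmpos.X;      % Leap reports millimetres
    poseMsg.Position.Y = 1e-3 * leap.Palmpos.Y;
    poseMsg.Position.Z = 1e-3 * leap.Palmpos.Z;
    q = eul2quat(deg2rad([leap.Ypr.X, leap.Ypr.Y, leap.Ypr.Z]));  % yaw/pitch/roll, assumed in degrees
    poseMsg.Orientation.W = q(1);  poseMsg.Orientation.X = q(2);
    poseMsg.Orientation.Y = q(3);  poseMsg.Orientation.Z = q(4);
    send(kukaPub, poseMsg);

    % 4-DOF hand command: three finger joints plus the preshape (scissor) joint
    handMsg = rosmessage(handPub);
    grip = 1.5;                                      % placeholder; the real controller derives this from the gesture
    handMsg.F1 = grip;  handMsg.F2 = grip;  handMsg.F3 = grip;
    handMsg.Preshape = 0;
    send(handPub, handMsg);
end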

Citation

@INPROCEEDINGS{koenig2021gesture,
  author={Koenig, Alexander and Rodriguez y Baena, Ferdinando and Secoli, Riccardo},
  booktitle={IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN)},
  title={Gesture-Based Teleoperated Grasping for Educational Robotics},
  year={2021}
}

Folder Structure

Folder      Contents
cad         CAD files of 3D printed parts
common      library of common functions used by leap_hand and leap_kuka
docs        project documentation
grasping    code for autonomous grasp prediction (work in progress)
kuka        code for controlling the KUKA robot
leap_hand   code to interface between the Leap Motion node and the robotic hand node
leap_kuka   code to interface between the Leap Motion node and the KUKA node
leap_rig    files to start Leap Motion control of the KUKA and the robotic hand
modules     third-party software (fetched on a recursive clone)
vision      all vision-related code (camera and virtual reality)

The directories vision and grasping were not discussed in our paper, but they are extensions of the project that can be pursued in future work.

Hardware Components

  • KUKA iiwa 7 R800 (or 14 R820)
  • ReFlex TakkTile Robotic Hand
  • Leap Motion
  • Oculus Rift DK2
  • Asus Xtion Pro Live
  • Ethernet switch
  • 3 Ethernet cables (Cat 5 or higher)

Software Components

The system was tested using the following software.

  • Ubuntu Xenial 16.04 LTS
  • Python 2.7
  • numpy 1.11.0
  • MATLAB Version 9.6 (R2019a)
  • MATLAB Instrument Control Toolbox Version 4.0 (R2019a)
  • MATLAB Robotics System Toolbox Version 2.2 (R2019a)
  • MATLAB Robotics System Toolbox Interface for ROS Custom Messages
  • ROS Kinetic (Desktop Install recommended)
  • KUKA MatlabToolboxServer running on the KUKA robot controller (see the user guide of the KUKA Sunrise Toolbox)
  • Only for grasp prediction: CUDA V10.1.168 (currently not needed)
  • Only for grasp prediction: Conda 4.6.14 (installed via Miniconda2)
  • Only for grasp prediction: Conda environment with dependencies in grasping/env/environment.yaml

Installation

  1. Install all of the above software components.
  2. Initialize a clean catkin workspace. Open a terminal and type:
mkdir -p ~/catkin_ws/src
cd ~/catkin_ws/src
catkin_init_workspace
  3. Clone this repository with all its submodules into the src folder of your workspace:
git clone --recursive https://github.com/axkoenig/leap_teleop.git
  4. Install all dependencies. You might need to set up rosdep first:
cd ~/catkin_ws
rosdep update
rosdep install --from-paths src --ignore-src -r -y
  5. Build the workspace with catkin build:
cd ~/catkin_ws
catkin build 
  6. Remember to source the setup.bash file and, if you like, add it to your .bashrc:
source ~/catkin_ws/devel/setup.bash
echo "source ~/catkin_ws/devel/setup.bash" >> ~/.bashrc
  7. Call rosgenmsg("~/catkin_ws/src/kuka/kuka_msgs") in the MATLAB Command Window and follow the on-screen instructions; a rough outline is sketched below. See further instructions here.
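
As a rough guide (the exact paths are printed by rosgenmsg and may differ on your machine), the follow-up steps in R2019a usually amount to the following:
rosgenmsg("~/catkin_ws/src/kuka/kuka_msgs")
% The printed instructions typically ask you to:
%  1) add the generated .jar files to javaclasspath.txt
%     (e.g. edit(fullfile(prefdir, 'javaclasspath.txt')) and paste the listed paths)
%  2) add the generated message folder to the MATLAB path and save it, e.g.
%     addpath('~/catkin_ws/src/kuka/kuka_msgs/matlab_gen/msggen'); savepath
%  3) restart MATLAB and verify the new message types with: rosmsg list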

Networking Setup

  • ReFlex TakkTile Robotic Hand (Address: 10.1.1.10, Netmask: 254.0.0.0, Gateway: 0.0.0.0; check the "Require IPv4 addressing" button). Also make sure your Ethernet connection shows up as "eth0"; if it does not, this link might help. Please see the robotic hand's documentation for more details. In their instructions you can skip all sections on cloning and building the drivers, as they are already included in this package.
  • KUKA Robot (in our case: Address: 172.31.1.55, Netmask: 16 (255.255.0.0), Gateway: 172.31.1.110).

Further Steps

  • To run the grasp prediction script, it is highly recommended to create a conda environment first. You can use the included environment file:
conda env create -f ~/catkin_ws/src/grasping/env/environment.yml
conda env list
  • To use the grasp prediction script, download a pretrained neural network using the download_pretrained_ggcnn.sh shell script.
  • Perform camera calibration (RGB and depth recommended) before using scripts that rely on the RGBD camera (especially AprilTag detection). Use the camera_calibration node contained in the image_pipeline module, which saves the calibration parameters directly to ~/.ros/camera_info. A backup of the calibration results for the camera used in this project can be found in the vision/calibration folder. Example commands to calibrate the RGB and depth cameras:
rosrun camera_calibration cameracalibrator.py --size 8x6 --square 0.0245 image:=/camera/rgb/image_raw camera:=/camera/rgb
rosrun camera_calibration cameracalibrator.py --size 8x6 --square 0.0245 image:=/camera/ir/image camera:=/camera/ir 
  • When printing the universal flange connector, make sure to stop the 3D printer at the right layer to insert the nuts.

Acknowledgments

  • Dr. Riccardo Secoli for providing scripts for the control of the Robotic Hand
  • M. Safeea and P. Neto, "KUKA Sunrise Toolbox: Interfacing Collaborative Robots With MATLAB," in IEEE Robotics & Automation Magazine, vol. 26, no. 1, pp. 91-96, March 2019.
  • D. Morrison, P. Corke and J. Leitner, "Closing the Loop for Robotic Grasping: A Real-time, Generative Grasp Synthesis Approach" in Robotics: Science and Systems (RSS), 2018.
  • Maintainers of third party repositories