This ROS package presents a perception-to-manipulation system for picking thin objects from clutter. The manipulation consists of four steps: detecting an object instance, approaching the target from overhead, descending until the fingertip contacts the object, and tilting to adjust the final grasp pose. Object detection is built on Mask R-CNN, a deep neural network for instance segmentation, while descending and tilting are driven by tactile (force/torque) sensing.
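The four steps above can be sketched as a simple sequential pipeline. This is only an illustration of the flow; the helper names (`detect`, `approach_overhead`, `descend_until_contact`, `tilt_and_grasp`) are hypothetical placeholders, not functions from this package:

```python
# Hypothetical sketch of the four-step picking pipeline described above.
# None of these method names exist in the package; they only show the order
# in which perception and manipulation hand off to each other.

def run_pick(scene_image, system):
    # 1. Detect object instances (Mask R-CNN) and pick a target.
    target = system.detect(scene_image)
    # 2. Move the gripper to a hover pose directly above the target.
    system.approach_overhead(target)
    # 3. Descend until the force/torque sensor reports fingertip contact.
    system.descend_until_contact()
    # 4. Tilt about the contact point into the final grasp pose.
    system.tilt_and_grasp()
    return system.log


class FakeSystem:
    """Stand-in that records the order of the pipeline steps."""

    def __init__(self):
        self.log = []

    def detect(self, img):
        self.log.append("detect")
        return (0.1, 0.2)  # e.g. 3-D centroid of a segmentation mask

    def approach_overhead(self, target):
        self.log.append("approach")

    def descend_until_contact(self):
        self.log.append("descend")

    def tilt_and_grasp(self):
        self.log.append("tilt")
```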
- Universal Robot UR10
- Robotiq 2F-140 adaptive parallel-jaw gripper (140 mm stroke)
- Robotiq FT300 Force Torque Sensor
- Intel RealSense SR300 camera
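The FT300 is what lets the descent stop at fingertip contact. A rough illustration of that logic: declare contact when the vertical force deviates from its free-space baseline by more than a threshold. The 2.0 N threshold is an arbitrary assumption, not a value taken from this package, and the topic name in the comment should be checked against your Robotiq driver:

```python
# Sketch: contact detection from force/torque readings during descent.
# The 2.0 N threshold is an assumed placeholder, not tuned for this system.

def contact_detected(fz, baseline_fz, threshold=2.0):
    """Return True once |fz - baseline| exceeds the contact threshold (N)."""
    return abs(fz - baseline_fz) > threshold

# With the Robotiq FT sensor driver running, fz would come from the
# geometry_msgs/WrenchStamped messages the driver publishes
# (typically on the "robotiq_ft_wrench" topic):
#
#   import rospy
#   from geometry_msgs.msg import WrenchStamped
#   rospy.Subscriber("robotiq_ft_wrench", WrenchStamped,
#                    lambda msg: check(msg.wrench.force.z))
```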
- Our package was developed on Ubuntu 16.04 with ROS Kinetic.
- urx: Python library for UR10 robot control.
- Robotiq ROS package: ROS driver for Robotiq adaptive gripper and force torque sensor.
- RealSense ROS Wrapper: ROS wrapper for the RealSense SR300.
- AprilTag ROS Wrapper: ROS wrapper for apriltag library.
- Mask R-CNN: Framework for Object Detection and Segmentation
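For orientation, the overhead-approach step maps naturally onto urx's `movel` (linear move in tool space). The helper below is a sketch, not code from this package: the hover height, accelerations, and robot IP are assumed placeholders, and `robot` is duck-typed so the sketch also runs against a stub:

```python
# Hypothetical wrapper around urx's Robot.movel for the overhead approach.
# Hover height, acc/vel values, and the tool-down orientation are assumptions.

def approach_overhead(robot, target_xyz, hover=0.15, acc=0.1, vel=0.1):
    """Move the TCP to a pose `hover` metres above the target, tool down."""
    x, y, z = target_xyz
    # urx poses are (x, y, z, rx, ry, rz); a (pi, 0, 0) rotation vector
    # points the tool straight down in the UR base frame.
    pose = (x, y, z + hover, 3.14159, 0.0, 0.0)
    robot.movel(pose, acc=acc, vel=vel)
    return pose

# Real use with urx (IP address is a placeholder):
#   import urx
#   rob = urx.Robot("192.168.1.102")
#   approach_overhead(rob, (0.4, -0.2, 0.05))
```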
In your catkin workspace:

```
cd ~/catkin_ws/src
git clone https://github.com/oliviaHKUST/pickpack.git
cd ..
catkin_make
```
- Follow the tutorials in the Universal Robot package for ROS Kinetic and the Robotiq ROS package to set up the hardware properly.
- Run the RealSense SR300 camera in ROS. See link.
- Run the apriltag_ros package. See https://github.com/AprilRobotics/apriltag_ros.
- Run the tilt-and-pivot script:

  ```
  roscd pickpack/scripts
  jupyter notebook
  ```

  Open `froce_yakeli_tilt_pivot.ipynb`.
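Geometrically, the tilt step rotates the gripper about the fingertip contact point. A minimal 2-D sketch of that pivot transform (pure math for illustration, not taken from the notebook's implementation):

```python
import math

def tilt_about_pivot(point, pivot, angle_rad):
    """Rotate `point` (e.g. the TCP) about `pivot` (the fingertip contact
    point) by `angle_rad` in the vertical plane; return the new point."""
    dx, dy = point[0] - pivot[0], point[1] - pivot[1]
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (pivot[0] + c * dx - s * dy,
            pivot[1] + s * dx + c * dy)
```

For example, a TCP 0.2 m above the contact point, tilted by 90 degrees, ends up 0.2 m to the side of it at the same height as the pivot.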
- Open a terminal and run object detection:

  ```
  cd samples
  jupyter notebook
  ```

  Open `instance_segmentation.ipynb`. For loading `BLISTER_MODEL_PATH`, please refer to here.

- Open another terminal and run manipulation:

  ```
  cd scripts
  jupyter notebook
  ```

  Open `thin_object_bin_pick_mani.ipynb`.
Qianyi Xu (qxuaj@connect.ust.hk), Zhekai Tong (ztong@connect.ust.hk) and Tierui He (theae@connect.ust.hk)