Note: This is a cleaned-up, PyTorch port of the GG-CNN code. For the original Keras implementation, see the RSS2018
branch.
The main changes are major code clean-ups and documentation, an improved GG-CNN2 model, the ability to use the Jacquard dataset, and simpler evaluation.
The GG-CNN is a lightweight, fully-convolutional network which predicts the quality and pose of antipodal grasps at every pixel in an input depth image. The lightweight and single-pass generative nature of GG-CNN allows for fast execution and closed-loop control, enabling accurate grasping in dynamic environments where objects are moved during the grasp attempt.
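To make the per-pixel parameterisation concrete, here is a minimal sketch of reading a grasp off the network's four output maps (quality, cos 2θ, sin 2θ, width, as described in the paper). The stand-in tensors, the 300×300 resolution, and the argmax selection are assumptions for illustration only, not this repo's actual inference code:

```python
import torch

# Sketch only: stand-in tensors in place of a real trained GG-CNN.
depth = torch.randn(1, 1, 300, 300)   # one single-channel depth image (size assumed)

# With a trained network, the four per-pixel maps would come from a forward pass:
#   pos, cos2t, sin2t, width = net(depth)
pos, cos2t, sin2t, width = (torch.randn(1, 1, 300, 300) for _ in range(4))

angle = 0.5 * torch.atan2(sin2t, cos2t)   # recover the grasp angle at every pixel
idx = pos.flatten().argmax().item()       # pixel with the highest grasp quality
y, x = divmod(idx, depth.shape[-1])
print(f"best grasp at ({x}, {y}): angle={angle[0, 0, y, x]:.2f} rad, "
      f"width={width[0, 0, y, x]:.2f}")
```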
This repository contains the implementation of the Generative Grasping Convolutional Neural Network (GG-CNN) from the paper:
Closing the Loop for Robotic Grasping: A Real-time, Generative Grasp Synthesis Approach
Douglas Morrison, Peter Corke, Jürgen Leitner
Robotics: Science and Systems (RSS) 2018
If you use this work, please cite:
```
@inproceedings{morrison2018closing,
  title={{Closing the Loop for Robotic Grasping: A Real-time, Generative Grasp Synthesis Approach}},
  author={Morrison, Douglas and Corke, Peter and Leitner, J\"urgen},
  booktitle={Proc.\ of Robotics: Science and Systems (RSS)},
  year={2018}
}
```
Contact
For any questions or comments, contact Doug Morrison.
This code was developed with Python 3.6 on Ubuntu 16.04. Python requirements can be installed by:
```bash
pip install -r requirements.txt
```
Currently, both the Cornell Grasping Dataset and Jacquard Dataset are supported.
- Download and extract the Cornell Grasping Dataset.
- Convert the PCD files to depth images by running `python -m utils.dataset_processing.generate_cornell_depth <Path To Dataset>` (a rough sketch of this conversion follows this list).
- Download and extract the Jacquard Dataset.
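For reference, the sketch below shows roughly what the Cornell conversion step does, assuming the organised-point-cloud layout of the Cornell PCD files (each data row is `x y z rgb index`, where `index = row * 640 + col` in the 640×480 image). `pcd_to_depth` is a hypothetical helper for illustration; `generate_cornell_depth` in this repo is the authoritative implementation.

```python
import numpy as np

def pcd_to_depth(pcd_path, shape=(480, 640)):
    """Rasterise an organised Cornell PCD file into a depth image of z values."""
    depth = np.zeros(shape, dtype=np.float32)
    with open(pcd_path) as f:
        for line in f:                    # skip the PCD header
            if line.startswith('DATA'):
                break
        for line in f:                    # data rows are: x y z rgb index
            vals = line.split()
            if len(vals) != 5:
                continue
            x, y, z, _rgb, index = vals
            row, col = divmod(int(float(index)), shape[1])
            depth[row, col] = float(z)    # missing points stay 0
    return depth
```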
Pre-trained models: coming soon.
Training is done by the `train_ggcnn.py` script. Run `train_ggcnn.py --help` to see a full list of options, such as dataset augmentation and validation options.
Some basic examples:
```bash
# Train GG-CNN on Cornell Dataset
python train_ggcnn.py --description training_example --network ggcnn --dataset cornell --dataset-path <Path To Dataset>

# Train GG-CNN2 on Jacquard Dataset
python train_ggcnn.py --description training_example2 --network ggcnn2 --dataset jacquard --dataset-path <Path To Dataset>
```
Trained models are saved in `output/models` by default, with the validation score appended.
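As a rough sketch of using a saved checkpoint (assuming it is a whole pickled model saved with `torch.save(net, path)` and that the forward pass returns the four per-pixel maps in the order shown; the placeholder filename and the 300×300 input are illustrative):

```python
import torch

# Assumption: the checkpoint is a whole pickled model (torch.save(net, path)).
# Replace the placeholder with an actual file from your output/models directory.
net = torch.load('output/models/<saved model file>', map_location='cpu')
net.eval()

with torch.no_grad():
    depth = torch.randn(1, 1, 300, 300)    # stand-in for a real depth image
    pos, cos2t, sin2t, width = net(depth)  # per-pixel grasp maps (assumed output order)
```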
Evaluation and visualisation of the trained networks are done using the `eval_ggcnn.py` script. Run `eval_ggcnn.py --help` for a full set of options.
Important flags are:
- `--iou-eval` to evaluate using the IoU between grasping rectangles metric (a sketch of this metric follows the example below).
- `--jacquard-output` to generate output files in the format required for simulated testing against the Jacquard dataset.
- `--vis` to plot the network output and predicted grasping rectangles.
For example:

```bash
python eval_ggcnn.py --network <Path to Trained Network> --dataset jacquard --dataset-path <Path to Dataset> --jacquard-output --iou-eval
```
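For reference, here is a hedged sketch of the grasping-rectangle IoU criterion that `--iou-eval` reports: under the standard Cornell metric used in the paper, a predicted grasp counts as correct if its rectangle overlaps a ground-truth rectangle with IoU above 25% and the grasp angles differ by less than 30°. The `grasp_correct` helper below is hypothetical and uses `shapely` (not a dependency of this repo); the repo's own implementation lives under `utils/dataset_processing`.

```python
import numpy as np
from shapely.geometry import Polygon

def grasp_correct(pred_corners, pred_angle, gt_corners, gt_angle,
                  iou_thresh=0.25, angle_thresh=np.pi / 6):
    """Corners are four (x, y) points per rectangle; angles are in radians."""
    p, g = Polygon(pred_corners), Polygon(gt_corners)
    iou = p.intersection(g).area / p.union(g).area
    # Grasps are symmetric under a 180-degree rotation, so compare angles mod pi.
    angle_diff = abs((pred_angle - gt_angle + np.pi / 2) % np.pi - np.pi / 2)
    return iou > iou_thresh and angle_diff < angle_thresh
```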
For our ROS implementation of the grasping system, see https://github.com/dougsm/mvp_grasp.
The original implementation for running experiments on a Kinova Mico arm can be found in the repository https://github.com/dougsm/ggcnn_kinova_grasping.