This ROS package provides a simulation of the picking challenge for the 1st year group project in the FARSCOPE Centre for Doctoral Training at Bristol Robotics Laboratory. It includes:
- a simulated environment with targets to be picked from shelves
- a model of FARSCOPE's mobile manipulator robot, the MMO-700 from Neobotix, comprising a UR10 arm on an omnidirectional mobile base.
- a simple gripper model
- a simple camera model
- example control scripts and launch files

The simulation is implemented in Gazebo.
The package was developed in ROS Melodic using Gazebo 9.0.0 on Ubuntu 18.04. It has also been tested on ROS Noetic. It requires the following packages:
- neo_simulation : simulation of the MMO-700 from Neobotix
- universal_robot : the UR10 support from ROS Industrial
- joint_state_publisher_gui : the GUI for generating fake joint states, needed for visualizing the robot
- ros_controllers : standard controllers for the simulated robot
Note: there is also an indirect dependency on MoveIt, a ROS motion planner required by the `universal_robot` package. You might end up using it, or you might not, so options for either working around it or installing MoveIt are given below.
Note: there are different ROS packages for the UR10 arm depending on what firmware is installed. This simulation does not guarantee compatibility with the real FARSCOPE arms as that has yet to be tested.
Note: this package is relatively simple but has not been tested with other versions of ROS and Gazebo.
- If you don't have it already, install ROS Melodic and set up a workspace using these instructions.
- Install the `joint_state_publisher_gui` using `sudo apt install ros-melodic-joint-state-publisher-gui`. (Replace `melodic` with your distribution if required.)
- Install `ros_controllers` using `sudo apt install ros-melodic-ros-controllers`.
- Clone this package, `neo_simulation` and `universal_robot` into the workspace `src` directory.
- Navigate up to the root directory of your ROS workspace (`cd ..` from `src`) and run `catkin_make`. If you get an error saying "moveit_core" not found, either re-run as `catkin_make -DCATKIN_BLACKLIST_PACKAGES="ur_kinematics"` or install the missing component using `sudo apt install ros-melodic-moveit` (again replacing `melodic` with your distribution if required).
- Run `roslaunch farscope_group_project farscope_example_robot_visualize.launch`. You should see an RViz visualization of the robot.
- Run `roslaunch farscope_group_project farscope_example_robot_simulate.launch`. You should see a Gazebo simulation of the robot picking a target off a shelf and dropping it.
- Run `roslaunch farscope_group_project farscope_example_robot_simulate.launch use_gui:=false use_rviz:=true`. Now Gazebo will run headless (i.e. no graphics front end) but you can still see some of what's happening in RViz, including the LIDAR, the camera views, and the robot pose.
This section describes the ROS interface started by the `launch/example_robot/farscope_example_robot_simulate.launch` file. Unless otherwise stated, these are all published or subscribed to by the Gazebo node, as a result of plugins enabled in the `robot_description` URDF.
Also included in that file is the node `scripts/example/test_pickup.py`, which shows examples of how to use the control interface.
Note: there are other topics available in the `/gazebo` namespace that are not shown in the diagram above. These must not be used as part of your final submission: they contain direct information feeds from the simulator and are essentially cheating the problem. You can, however, use them for development, if it serves a purpose.
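For example, during development you could peek at Gazebo's ground-truth model poses. A minimal sketch, assuming the standard `/gazebo/model_states` topic that Gazebo publishes (again: development only, not for your final submission):

```python
#!/usr/bin/env python
# Development-only example: read ground-truth model poses from Gazebo.
# Do NOT use /gazebo topics in your final submission.
import rospy
from gazebo_msgs.msg import ModelStates

def on_model_states(msg):
    # msg.name and msg.pose are parallel lists of model names and poses
    rospy.loginfo_throttle(5.0, 'Models in the world: %s', ', '.join(msg.name))

rospy.init_node('gazebo_peek')
rospy.Subscriber('/gazebo/model_states', ModelStates, on_model_states)
rospy.spin()
```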
Published topics (incomplete list):
- lidar_scan : sensor_msgs/LaserScan : output of the front-mounted LIDAR on the mobile base
- camera1/image_raw : sensor_msgs/Image : output of the forearm camera
- camera2/image_raw : sensor_msgs/Image : output of the upper arm camera
- joint_states : sensor_msgs/JointState : states of all robot joints, including arm and gripper
- odom : nav_msgs/Odometry : odometry from the mobile robot base
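As an illustration of consuming these, here is a minimal rospy node that reports the closest LIDAR return; the topic and message type are as listed above, and the node name is arbitrary:

```python
#!/usr/bin/env python
# Minimal example: print the closest LIDAR return.
import rospy
from sensor_msgs.msg import LaserScan

def on_scan(scan):
    # scan.ranges holds one distance per beam; keep only in-range readings
    valid = [r for r in scan.ranges if scan.range_min < r < scan.range_max]
    if valid:
        rospy.loginfo_throttle(1.0, 'Closest obstacle: %.2f m', min(valid))

rospy.init_node('lidar_listener')
rospy.Subscriber('lidar_scan', LaserScan, on_scan)
rospy.spin()
```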
Subscribed topics (incomplete list):
- cmd_vel : geometry_msgs/Twist : command to move robot base
- finger1_controller/command : std_msgs/Float64 : position command for gripper finger 1
- finger2_controller/command : std_msgs/Float64 : position command for gripper finger 2
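For example, a short sketch that nudges the base forward and then commands both fingers; the finger positions here are illustrative placeholders, not calibrated values (see the `move_gripper.py` utility script for manual gripper operation):

```python
#!/usr/bin/env python
# Minimal example: drive the base briefly, then command the gripper fingers.
import rospy
from geometry_msgs.msg import Twist
from std_msgs.msg import Float64

rospy.init_node('base_and_gripper_demo')
cmd_pub = rospy.Publisher('cmd_vel', Twist, queue_size=1)
f1_pub = rospy.Publisher('finger1_controller/command', Float64, queue_size=1)
f2_pub = rospy.Publisher('finger2_controller/command', Float64, queue_size=1)
rospy.sleep(1.0)  # give the publishers time to connect

# drive forward at 0.2 m/s for two seconds
twist = Twist()
twist.linear.x = 0.2
end_time = rospy.Time.now() + rospy.Duration(2.0)
rate = rospy.Rate(10)
while rospy.Time.now() < end_time and not rospy.is_shutdown():
    cmd_pub.publish(twist)
    rate.sleep()
cmd_pub.publish(Twist())  # stop the base

# placeholder finger positions; tune against the actual gripper
f1_pub.publish(Float64(0.04))
f2_pub.publish(Float64(0.04))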
See the ROS actionlib documentation for information on using ROS actions.
- arm_controller/follow_joint_trajectory : control_msgs/FollowJointTrajectoryAction : controls the movement of the UR10 arm
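A sketch of commanding the arm through this action server with actionlib; the joint names below are the standard UR10 names from the `universal_robot` package, and the target angles are placeholders only:

```python
#!/usr/bin/env python
# Minimal example: send a single-point trajectory to the arm action server.
import rospy
import actionlib
from control_msgs.msg import FollowJointTrajectoryAction, FollowJointTrajectoryGoal
from trajectory_msgs.msg import JointTrajectoryPoint

rospy.init_node('arm_demo')
client = actionlib.SimpleActionClient('arm_controller/follow_joint_trajectory',
                                      FollowJointTrajectoryAction)
client.wait_for_server()

goal = FollowJointTrajectoryGoal()
# standard UR10 joint names from the universal_robot package
goal.trajectory.joint_names = ['shoulder_pan_joint', 'shoulder_lift_joint',
                               'elbow_joint', 'wrist_1_joint',
                               'wrist_2_joint', 'wrist_3_joint']
point = JointTrajectoryPoint()
point.positions = [0.0, -1.57, 1.57, 0.0, 0.0, 0.0]  # placeholder pose (rad)
point.time_from_start = rospy.Duration(5.0)
goal.trajectory.points.append(point)

client.send_goal(goal)
client.wait_for_result()
```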
Parameters:
- robot_description : the URDF model of the robot
- target_description : URDF model of an individual pick-up target
- scenario : data on the locations of the targets in the world (see `scenario_all.yaml` for an example of the internal format)
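These are ordinary ROS parameters and can be read accordingly; a minimal sketch, assuming they are set in the namespace shown above:

```python
#!/usr/bin/env python
# Minimal example: read the challenge parameters from the parameter server.
import rospy

rospy.init_node('param_peek')
urdf = rospy.get_param('robot_description')  # URDF XML as a string
scenario = rospy.get_param('scenario')       # structure as in scenario_all.yaml
rospy.loginfo('URDF is %d characters long', len(urdf))
rospy.loginfo('Scenario data: %s', scenario)
```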
The robot is represented in Unified Robot Description Format (URDF), generated using Xacro XML macros for flexibility. The URDF includes `<gazebo>` tags to encode robot actuation and sensing. You are provided with a fully implemented model of a suitable robot, including a simple two-finger gripper, two cameras, and a scanning LIDAR sensor on a mobile base. Your options for customizing the robot include:
- Using the integrated example robot provided in `models/example_robot/farscope_example_robot.urdf.xacro` as is
- Making your own modified version of `models/example_robot/farscope_example_robot.urdf.xacro` to move, add or remove cameras
- Replacing the simple gripper with something of your own design, including custom CAD (see URDF tutorials) and actuation (warning: custom actuation is difficult and requires learning `ros_control`)
- Making a completely new robot from scratch (not recommended: lots of ROS detail to learn)
package.xml : defines package data such as dependencies and authorship, for ROS
CMakeLists.txt : used by the catkin_make build system
setup.py : defines the Python modules that are shared from this package
aws_notes.md : incomplete discussions on getting this working on AWS Robomaker
./worlds:
farscope_test_2.world : tells Gazebo where to put the shelves and cones
./src/farscope_group_project:
farscope_robot_utils.py : Python library providing interfaces for the arm, base and gripper (hiding the ROS away, optionally)
./models/example_robot:
farscope_example_robot.urdf.xacro : robot model combining the mobile base, UR10 arm, and gripper, to be used as example for customization
./models/mobile_arm:
mobile_arm.urdf.xacro : provides a xacro 'macro' to include the mobile manipulator in other URDF/xacro models
./models/camera:
simple_camera.urdf.xacro : xacro macro for including camera (physical model and gazebo functionality) in other URDF/xacro models
./models/target:
target.urdf : model of the target or "trophy"
target.stl : CAD mesh of the trophy
./models/gripper:
simple_gripper.urdf.xacro : xacro macro for the simple gripper, to be included in integrated models
./scripts/example:
move_gripper.py : utility script for manual operation of the gripper
move_base.py : utility script for manual operation of the base
move_arm.py : utility script for manual operation of the arm
test_pickup.py : example of an integrated robot controller that does just one pickup in very simplistic, open-loop way
./scripts/setup:
spawn_targets.py : script that spawns targets, including randomization, to set up the challenge scenario
./controller:
gripper.yaml : definition of the controllers used in Gazebo to simulate feedback control of the simple gripper
./scenarios: these files tell the spawn_targets.py script where to put trophies and with what level of randomization
typical.yaml : default scenario with random misplacements, omissions and duplicates
all_no_random.yaml : scenario with one target placed exactly in the centre of every shelf
./launch/common:
ur10_controllers.launch : loads the parameters for the UR10 controller, as simulated by the Gazebo plug-ins
gripper_controller.launch : loads the parameters for the gripper controller
./launch/example_robot:
farscope_example_robot_simulate.launch : simulates the example robot and the test_pickup.py example controller
simulate.rviz : config for RViz that shows sensor outputs
farscope_example_robot_visualize.launch : displays the example robot model in RViz and lets you manipulate its joints
visualize.rviz : config for RViz to show robot model
farscope_example_robot_control.launch : (for AWS use) just the robot controller parts of farscope_example_robot_simulate.launch
farscope_example_robot_simulator_only.launch : (for AWS use) just the simulator parts of farscope_example_robot_simulate.launch
./launch/challenge:
farscope_group_challenge_gazebo.launch : standard parts of all challenge simulations, including target spawning and Gazebo with the shelves and cones
The goal of the group project is to experience integration of a robotic system, not to perform research-grade innovation in any individual subsystem. Therefore the challenge has been subtly modified to simplify some elements. In particular, the target object has been made easy to grasp, with a handy lip at the top to avoid the need for friction gripping. Also, it has been made easy to detect, with a contrasting colour and symmetrical appearance.
Physics simulation is hard. Even with Gazebo, it would take considerable tuning to represent realistic grasping, contact and friction. Be ready for some odd behaviour:
- The robot gently rotates even while it is commanded to stay still
- If you drive the robot into a shelf, the robot will probably flip itself over, but the shelf will be undamaged
- A target may spontaneously jump from your grasp
- A target may mysteriously cling to you when released

These are things you will just have to work around in the pursuit of your project.