easy-kinesthetic-recording

A package with all the scripts and commands needed to record joint and end-effector trajectories (and more) from multiple robots for kinesthetic teaching.

Scripts and instructions to easily record data from kinesthetic demonstrations as rosbags and convert it to MATLAB (or Python, experimental) using the Franka Emika Panda robot.


Installation

Dependencies
franka_interactive_controllers
record_ros
rosbag_to_mat (Working - If you want to export data to MATLAB)
bagpy (Experimental - If you want to export data to Python)

To automagically install all dependencies, follow these steps:

  • In your catkin workspace's src directory, clone the repository:
$ git clone -b latest-franka https://github.com/nbfigueroa/easy-kinesthetic-recording.git
  • Use wstool to fetch the remaining git repository dependencies; after the following commands you should see extra catkin packages in your src directory:
$  wstool init
$  wstool merge easy-kinesthetic-recording/dependencies.rosinstall 
$  wstool up 
  • Query and install all required system libraries and packages:
$ rosdep install --from-paths . --ignore-src --rosdistro noetic 

Step 1: Recording Kinesthetic Demonstrations as ROSBags

Bringup Kinesthetic Teaching Pipeline

Run Franka-ROS-Kinesthetic Controller

Here we assume you have installed the franka_interactive_controllers package and know how to use it.

In two terminals you should launch the following:

roslaunch franka_interactive_controllers franka_interactive_bringup.launch
roslaunch franka_interactive_controllers joint_gravity_compensation_controller.launch
Run Topic Recorder

In the launch file launch/franka_record_demonstrations.launch you can define the topics you wish to record via the following argument:

	<arg name="topic"
		default="/tf
		/franka_state_controller/joint_states
		/franka_state_controller/F_ext
		/franka_state_controller/O_T_EE
		/franka_state_controller/O_T_FL
		/franka_gripper/joint_states"/>
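In libfranka, the end-effector pose O_T_EE is a 4x4 homogeneous transform flattened into 16 elements in column-major order. Assuming the recorded topic preserves that layout (check the actual message type with rostopic type before relying on this), a minimal sketch of unpacking it:

```python
# Sketch: unpack a flattened, column-major O_T_EE array (libfranka's
# FrankaState convention) into a 4x4 row-major matrix and a position.
# The assumption that this topic uses that exact layout is ours.

def unpack_col_major(flat):
    """Turn a 16-element column-major list into a 4x4 row-major matrix."""
    assert len(flat) == 16
    return [[flat[col * 4 + row] for col in range(4)] for row in range(4)]

def ee_position(flat):
    """The translation lives in the last column of the homogeneous transform."""
    T = unpack_col_major(flat)
    return [T[0][3], T[1][3], T[2][3]]

# Identity rotation with translation (0.3, 0.0, 0.5):
pose = [1, 0, 0, 0,  0, 1, 0, 0,  0, 0, 1, 0,  0.3, 0.0, 0.5, 1]
print(ee_position(pose))  # -> [0.3, 0.0, 0.5]
```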

You must also define the path to the directory where all bags will be saved, and the bag file name prefix:

<arg name="path_save"      default="/home/panda2/rosbag_recordings/cooking/"/>
<arg name="file_name"  	   default="demo"/>

Once you've done this, you can run the following launch file:

roslaunch easy_kinesthetic_recording franka_record_demonstrations.launch

Alternatively, you can launch the following launch file from franka_interactive_controllers, which brings up both the joint gravity compensation controller and the topic recording launch file:

roslaunch franka_interactive_controllers franka_kinesthetic_teaching.launch

You should now see the following displayed on your screen (without the trajectories):

NOTE: If you run this script and the robot moves by itself, your external_tool_compensation forces are incorrect. See the external_tool_compensation instructions to correct them.

Record Demonstrations as ROSbags

To start/stop a rosbag recording you can either:

  • Press the buttons on the GUI as shown in the image above
  • Type the following in a terminal:
rosservice call /record/cmd "cmd: 'record/stop'"
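The same service call toggles between starting and stopping a recording. If you prefer to trigger it from Python, a minimal sketch using subprocess (the helper names here are hypothetical, not part of the package):

```python
# Sketch: call the record_ros toggle service from Python by shelling out
# to the same rosservice command shown above. Helper names are ours.
import subprocess

def record_toggle_cmd(service="/record/cmd"):
    # Build the exact CLI call as an argv list.
    return ["rosservice", "call", service, "cmd: 'record/stop'"]

def toggle_recording():
    # Requires a running roscore and the record_ros node (not executed here).
    subprocess.run(record_toggle_cmd(), check=True)

print(record_toggle_cmd())
```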

Replaying a recorded demonstration

You can replay the recorded demonstrations by running the following commands:

Visualization:
roslaunch easy_kinesthetic_recording franka_replay_bag_demonstrations.launch

Play bag:
rosbag play *.bag

If the following variables are set to true:

  • <arg name="viz_traj" default="true"/>
  • <arg name="viz_obj" default="true"/>

you can see the trajectories being replayed with the Franka. The gripper will not be shown and you might see some errors in RViz, but that's fine. The colored block represents the state of the gripper:

  • green: an object is grasped
  • gray: no object is grasped
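This indicator can be derived from the /franka_gripper/joint_states finger positions. Purely as a hypothetical sketch (the width threshold below is an assumption, not the value used by the actual replay script):

```python
# Hypothetical sketch: classify the gripper state from its two prismatic
# finger joint positions (metres). The threshold is an assumption.

def gripper_state_color(finger_positions, grasp_width=0.035):
    """Green if the total opening width is below the threshold, else gray."""
    width = sum(finger_positions)
    return "green" if width < grasp_width else "gray"

print(gripper_state_color([0.035, 0.035]))  # fingers wide open -> gray
print(gripper_state_color([0.01, 0.01]))    # closed on an object -> green
```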

See examples below.

Examples

This code together with franka_interactive_controllers has been used for two household tasks:

  • cooking preparation task: scooping and mixing ingredients from bowls

Left: Video of kinesthetic demonstration, Right: Visualization of recorded trajectories by replaying recorded rosbag

  • table setting task: grasping plates/cutlery from a dish rack and placing them on a table.

Left: Video of kinesthetic demonstration, Right: Visualization of recorded trajectories by replaying recorded rosbag


Step 2: Extracting Trajectories from ROSBag Data for Task Learning

Extracting ROSBag Data to MATLAB (Working)

To export the data recorded in the rosbags to MATLAB you can use the rosbag_to_mat package. Follow the instructions in the README file to extract data for the following tasks:

  • cooking preparation task: raw trajectories from demonstrations (colors indicate continuous demonstration):

  • table setting task: raw trajectories from demonstrations (colors indicate continuous demonstration):
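rosbag_to_mat defines its own export format; purely as an illustration of the kind of .mat structure such a pipeline produces, a sketch using scipy.io.savemat (the variable name "ee_traj" is hypothetical):

```python
# Illustrative only: write a fake end-effector trajectory to a .mat file
# and read it back. This mimics the shape of a rosbag-to-MATLAB export,
# not the actual rosbag_to_mat format.
import os
import tempfile

import numpy as np
from scipy.io import loadmat, savemat

# Fake end-effector trajectory: a 3 x T array of positions from one demo.
ee_traj = np.linspace(0.0, 1.0, 30).reshape(3, 10)

path = os.path.join(tempfile.mkdtemp(), "demo_1.mat")
savemat(path, {"ee_traj": ee_traj})

back = loadmat(path)
print(back["ee_traj"].shape)  # (3, 10)
```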

Extracting ROSBag Data to Python (Experimental)

This functionality hasn't been tested yet, but I suggest trying out bagpy: a Python package that provides the specialized class bagreader to read and decode ROS messages from bag files in just a few lines of code.


Step 3: Trajectory Segmentation of Multi-Step Tasks for Motion Policy Learning

If the trajectories are continuous demonstrations of a multi-step task that will be represented as a sequence of goal-oriented motion policies, then the trajectories must first be segmented.

  • cooking preparation task: segmented and processed trajectories from demonstrations (colors indicate trajectory clusters), see README for segmentation algorithm details:

  • table setting task: segmented and processed trajectories from demonstrations (colors indicate trajectory clusters), see README for segmentation details:
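The actual segmentation algorithm is described in the rosbag_to_mat README. Purely as an illustration of the idea, a continuous demonstration could be cut wherever the gripper state flips (grasp/release events), which is one simple hypothetical proxy for sub-task boundaries:

```python
# Illustrative only: split a demonstration at gripper open/close events.
# This is NOT the segmentation algorithm used by rosbag_to_mat.

def segment_by_gripper(samples):
    """Split a demo into segments whenever the `grasped` flag flips.

    `samples` is a list of (position, grasped) pairs.
    """
    segments, current = [], []
    last_state = None
    for pos, grasped in samples:
        if last_state is not None and grasped != last_state:
            segments.append(current)
            current = []
        current.append(pos)
        last_state = grasped
    if current:
        segments.append(current)
    return segments

demo = [(0.0, False), (0.1, False), (0.2, True), (0.3, True), (0.4, False)]
print(len(segment_by_gripper(demo)))  # 3 segments: approach, carry, retreat
```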


Contact

Nadia Figueroa (nadiafig AT seas dot upenn dot edu)