S-PTAM is a stereo SLAM system able to compute the camera trajectory in real time. It heavily exploits the parallel nature of the SLAM problem, separating the time-constrained pose estimation from less pressing tasks such as map building and refinement. Moreover, the stereo setting makes it possible to reconstruct a metric 3D map from each pair of stereo images, improving the accuracy of the mapping process with respect to monocular SLAM and avoiding the well-known bootstrapping problem. The real scale of the environment is also an essential feature for robots that have to interact with their surrounding workspace.
(Click the image to redirect to S-PTAM video)
[1] Taihú Pire, Thomas Fischer, Javier Civera, Pablo De Cristóforis and Julio Jacobo Berlles.
Stereo Parallel Tracking and Mapping for Robot Localization
Proc. of The International Conference on Intelligent Robots and Systems (IROS) (Accepted), Hamburg, Germany, 2015.
S-PTAM is released under GPLv3 license.
For a closed-source version of S-PTAM for commercial purposes, please contact the authors.
If you use S-PTAM in an academic work, please cite:
@inproceedings{pireIROS15,
  title = {{Stereo Parallel Tracking and Mapping for Robot Localization}},
  author = {Pire, Taih\'{u} and Fischer, Thomas and Civera, Javier and De Crist\'{o}foris, Pablo and Jacobo Berlles, Julio},
  booktitle = {Proc. of The International Conference on Intelligent Robots and Systems (IROS) (Accepted)},
  year = {2015}
}
This site and the code provided here are under active development. Even though we try to release only working, high-quality code, this version might still contain some issues. Please use it with caution.
We have tested S-PTAM in Ubuntu 14.04 with ROS Indigo.
To install ROS (indigo) use the following command:
sudo apt-get install ros-indigo-desktop
Install the g2o library from the source code provided at
svn co https://svn.openslam.org/data/svn/g2o
Note: the g2o ROS package ros-indigo-libg2o in Ubuntu 14.04 noticeably reduces the performance of S-PTAM; build g2o from source instead.
sudo apt-get install ros-indigo-pcl-ros
git clone git@github.com:lrse/sptam.git
catkin_make --pkg sptam -DSHOW_TRACKED_FRAMES=ON
SHOW_TRACKED_FRAMES=([ON|OFF], default: OFF)
Show the tracked frames by S-PTAM. Set it OFF to improve S-PTAM performance.
We provide some examples of how to run S-PTAM with the most popular stereo datasets.
- Download the KITTI rosbag kitti_00.bag provided in KITTI rosbag files
- Uncompress the dataset
  rosbag decompress kitti_00.bag
- Set the use_sim_time ROS variable to true
  rosparam set use_sim_time true
- Play the dataset
  rosbag play --clock kitti_00.bag
  (When S-PTAM is built with the flag SHOW_TRACKED_FRAMES=ON, performance is noticeably reduced.)
- Run S-PTAM using kitti.launch
  roslaunch sptam kitti.launch
- Download the MIT Stata Center rosbag 2012-01-27-07-37-01.bag provided in MIT Stata Center Web Page
- Set the use_sim_time ROS variable to true
  rosparam set use_sim_time true
- Play the dataset
  rosbag play --clock 2012-01-27-07-37-01.bag -s 302.5 -u 87
  (This plays part 3 of the sequence, for which ground truth is provided; that is why the bag starts at a different timestamp.)
- Run S-PTAM using mit.launch
  roslaunch sptam mit.launch
- Download the Level7 rosbag level07_20_05_12_trunc.bag (3747 Frame Subset) provided in Indoor Level 7 S-Block Dataset Web Page
- Set the use_sim_time ROS variable to true
  rosparam set use_sim_time true
- Play the dataset
  rosbag play --clock level7_truncated.bag
- Run S-PTAM using level7.launch
  roslaunch sptam level7.launch
Camera topics should provide undistorted and stereo-rectified images. Consider using the image_proc node.
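For example, if a camera driver publishes raw images and calibration under the /stereo namespace, a launch fragment along these lines (a sketch; the namespace is an assumption to adapt to your setup) runs the stereo variant of the image_proc pipeline to produce the rectified topics S-PTAM subscribes to:

```xml
<!-- Sketch: rectify raw stereo images before running S-PTAM.
     Assumes a driver publishing image_raw and camera_info under
     /stereo/left and /stereo/right. -->
<launch>
  <group ns="stereo">
    <node pkg="stereo_image_proc" type="stereo_image_proc"
          name="stereo_image_proc" />
  </group>
</launch>
```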
/stereo/left/image_rect (sensor_msgs/Image)
Undistorted and stereo-rectified image stream from the left camera.
/stereo/left/camera_info (sensor_msgs/CameraInfo)
Left camera metadata.
/stereo/right/image_rect (sensor_msgs/Image)
Undistorted and stereo-rectified image stream from the right camera.
/stereo/right/camera_info (sensor_msgs/CameraInfo)
Right camera metadata.
point_cloud (sensor_msgs/PointCloud2)
Sparse mapped point cloud used for tracking.
camera_pose (geometry_msgs/PoseStamped)
Current camera pose computed after tracking.
~use_odometry: (bool, default: false)
Replace the decay velocity motion model with odometry.
~odom_frame: (string, default: "odom")
Reference frame for odometry messages, if odometry is used.
~base_link_frame: (string, default: "base_link")
Reference frame for the robot base.
~camera_frame: (string, default: "camera")
Reference frame for the left camera, used to get left camera pose from tf.
~approximate_sync: (bool, default: false)
Whether to use approximate synchronization. Set to true if the left and right cameras do not produce exactly synchronized timestamps.
~FeatureDetector/Name: (string, default: "GFTT")
Follows OpenCV convention.
~DescriptorExtractor/Name: (string, default: "BRIEF")
Follows OpenCV convention.
~DescriptorMatcher/Name: (string, default: "BruteForce-Hamming")
Follows OpenCV convention.
~DescriptorMatcher/crossCheck: (bool, default: false)
Follows OpenCV convention.
~MatchingCellSize: (int, default: 15)
To match map points with image features, each frame is divided into square cells of fixed size; MatchingCellSize defines the size of each cell.
~MatchingNeighborhood: (int, default: 1)
Number of neighboring cells, around the cell containing the projected point, searched for candidate matches.
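As an illustration of these two parameters, a minimal Python sketch of the cell-based lookup (hypothetical helper names; S-PTAM itself implements this in C++):

```python
from collections import defaultdict

def bucket_features(keypoints, cell_size=15):
    """Assign each keypoint (x, y) to a square cell of side cell_size
    (the MatchingCellSize parameter)."""
    grid = defaultdict(list)
    for i, (x, y) in enumerate(keypoints):
        grid[(int(x) // cell_size, int(y) // cell_size)].append(i)
    return grid

def candidates_near(grid, point, cell_size=15, neighborhood=1):
    """Gather feature indices from the cell containing the projected map
    point and from `neighborhood` rings of cells around it
    (the MatchingNeighborhood parameter)."""
    cx, cy = int(point[0]) // cell_size, int(point[1]) // cell_size
    candidates = []
    for dx in range(-neighborhood, neighborhood + 1):
        for dy in range(-neighborhood, neighborhood + 1):
            candidates.extend(grid.get((cx + dx, cy + dy), []))
    return candidates
```

Only the descriptors gathered this way are compared against each map point, which keeps the matching cost roughly constant per point instead of linear in the number of image features.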
~MatchingDistance: (double, default: 25.0)
Maximum descriptor distance for a valid match. Use an integer value when the Hamming distance is used and a fractional value when the L1/L2 norm is used.
~EpipolarDistance: (double, default: 0.0)
Distance in rows from the epipolar line used to find stereo matches.
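These two thresholds can be illustrated with a toy stereo-match filter (a Python sketch with hypothetical names, assuming rectified images and binary descriptors compared with the Hamming distance):

```python
def hamming(d1, d2):
    """Hamming distance between two equal-length binary descriptors."""
    return sum(bin(a ^ b).count("1") for a, b in zip(d1, d2))

def is_stereo_match(kp_left, kp_right, desc_left, desc_right,
                    matching_distance=25.0, epipolar_distance=0.0):
    """Accept a left/right keypoint pair (x, y) if their rows differ by
    at most epipolar_distance (rectification puts true matches on the
    same row) and their descriptor distance is below matching_distance."""
    rows_agree = abs(kp_left[1] - kp_right[1]) <= epipolar_distance
    return rows_agree and hamming(desc_left, desc_right) < matching_distance
```

With the default EpipolarDistance of 0.0, candidate matches must lie on exactly the same image row.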
~FrustumNearPlaneDist: (double, default: 0.1)
Frustum (Field of View) near plane.
~FrustumFarPlaneDist: (double, default: 1000.0)
Frustum (Field of View) far plane.
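As a sketch of what these two parameters bound (hypothetical helper; the full frustum test also involves the side planes derived from the camera model), a map point is only considered for matching when its depth in the camera frame lies between the near and far planes:

```python
def between_depth_planes(point_cam, near=0.1, far=1000.0):
    """Check a 3D point (x, y, z) expressed in the camera frame against
    the frustum's near/far planes: z is the depth along the optical axis
    and must lie in [FrustumNearPlaneDist, FrustumFarPlaneDist]."""
    return near <= point_cam[2] <= far
```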