
ros_object_analytics

Object Analytics (OA) is a ROS wrapper for real-time object detection, localization and tracking. These packages aim to provide real-time object analyses over RGB-D camera inputs, enabling ROS developers to easily create advanced robotics features, such as intelligent collision avoidance and semantic SLAM. It consumes sensor_msgs::PointCloud2 data delivered by an RGB-D camera and publishes topics on object detection, object tracking, and object localization in the 3D camera coordinate system.

OA keeps integrating various state-of-the-art algorithms.

compiling dependencies

ROS packages from ros-kinetic-desktop-full

  • roscpp
  • nodelet
  • std_msgs
  • sensor_msgs
  • geometry_msgs
  • dynamic_reconfigure
  • pcl_conversions
  • cv_bridge
  • libpcl-all
  • libpcl-all-dev
  • ros-kinetic-opencv3

Other ROS packages

NOTE: OA depends on the tracking feature from OpenCV (3.3 preferred, 3.2 minimum). The tracking feature is provided by recent builds of the ROS Kinetic package "ros-kinetic-opencv3" (which integrates OpenCV 3.3.1). However, if you are using an older ROS Kinetic release (which integrates OpenCV 3.2), the tracking feature is not included; in that case you need to build tracking yourself from opencv_contrib. It is important to keep the self-built opencv_contrib and the ROS-Kinetic-provided OpenCV at the same version, which can be checked in "/opt/ros/kinetic/share/opencv3/package.xml".
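As a quick check (a sketch only; the package.xml layout and a self-built install's pkg-config name may differ on your system), the two versions can be compared with:

    # OpenCV version delivered by the ROS Kinetic package
    grep '<version>' /opt/ros/kinetic/share/opencv3/package.xml

    # version of a self-built OpenCV/opencv_contrib, if its pkg-config file is installed
    pkg-config --modversion opencv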

build and test

  • to build (see also the end-to-end workspace sketch after this list)
    cd ${ros_ws} # "ros_ws" is the catkin workspace root directory where this project is placed
    catkin_make
  • to test
    catkin_make run_tests
  • to install
    catkin_make install
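
For reference, a minimal end-to-end workspace sketch, assuming ROS Kinetic at /opt/ros/kinetic, a workspace at ~/ros_ws, and the intel/ros_object_analytics repository as the clone source (adjust paths and URL to your setup):

    mkdir -p ~/ros_ws/src
    cd ~/ros_ws/src
    git clone https://github.com/intel/ros_object_analytics.git
    cd ~/ros_ws
    source /opt/ros/kinetic/setup.bash
    catkin_make           # build
    catkin_make run_tests # test
    catkin_make install   # install
    source devel/setup.bash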

extra running dependencies

RGB-D camera

Launch one of the following drivers, depending on the camera in use:

roslaunch realsense_ros_camera rs_rgbd.launch # Intel RealSense
roslaunch openni_launch openni.launch # OpenNI-compatible camera
roslaunch astra_launch astra.launch # Orbbec Astra
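
Whichever driver is used, it should be publishing registered point clouds before OA is launched. A quick sanity check (assuming the default topic name described in the launch options below):

    rostopic hz /camera/depth_registered/points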

command to launch object_analytics

  • launch with OpenCL caffe as detection backend

    roslaunch object_analytics_launch analytics_opencl_caffe.launch
  • launch with Movidius NCS as detection backend

    roslaunch object_analytics_launch analytics_movidius_ncs.launch

    Frequently used options

    • input_points Specify arg "input_points" as the name of the topic on which the RGB-D camera publishes sensor_msgs::PointCloud2 messages. Default is "/camera/depth_registered/points" (topic compliant with ROS OpenNI launch).
    • aging_th Specify the tracking aging threshold, i.e. the number of frames since the last detection after which a track is deactivated. Default is 16.
    • probability_th Specify the probability threshold for tracking an object. Default is "0.5". A combined example is shown below.
    roslaunch object_analytics_launch analytics_movidius_ncs.launch aging_th:=30 probability_th:="0.3"
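
For example, all three options can be combined in a single launch command (the topic value shown here is just the default quoted above):

    roslaunch object_analytics_launch analytics_movidius_ncs.launch input_points:=/camera/depth_registered/points aging_th:=30 probability_th:="0.3"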

published topics

object_analytics/rgb (sensor_msgs::Image)

object_analytics/pointcloud (sensor_msgs::PointCloud2)

object_analytics/localization (object_analytics_msgs::ObjectsInBoxes3D)

object_analytics/tracking (object_analytics_msgs::TrackedObjects)

object_analytics/detection (object_msgs::ObjectsInBoxes)
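
Once a pipeline is running, these topics can be inspected with the standard ROS command-line tools, for example (topic names assume the default global namespace):

    rostopic list | grep object_analytics
    rostopic echo /object_analytics/detection
    rostopic info /object_analytics/localization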

KPI of different detection backends

backend        topic          fps     latency (sec)
OpenCL Caffe   localization   6.63    0.23
OpenCL Caffe   detection      8.88    0.17
OpenCL Caffe   tracking       12.15   0.33
Movidius NCS   localization   7.44    0.21
Movidius NCS   detection      10.5    0.15
Movidius NCS   tracking       13.85   0.24
  • The CNN model used with Movidius NCS is MobileNet
  • Hardware: Intel(R) Xeon(R) CPU E3-1275 v5 @3.60GHz, 32GB RAM, Intel(R) RealSense R45
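
The fps column can be approximated on your own hardware by measuring each topic's publishing rate while the pipeline is running; rostopic delay (available in recent ros_comm releases) gives a rough end-to-end latency relative to the message header stamp:

    rostopic hz /object_analytics/detection
    rostopic hz /object_analytics/tracking
    rostopic hz /object_analytics/localization
    rostopic delay /object_analytics/detection # rough latency estimate, assumes header stamps are set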

visualize tracking and localization results on RViz

Steps to enable visualization on RViz are as follows:

roslaunch object_analytics_visualization rviz.launch

ROS 2 Object Analytics: https://github.com/intel/ros2_object_analytics

Any security issue should be reported using the process at https://01.org/security