ROS package to detect the overall engagement of users from a robot's ego camera during human-robot interactions.
The approach is detailed in the journal paper: Del Duchetto F, Baxter P and Hanheide M (2020) Are You Still With Me? Continuous Engagement Assessment From a Robot's Point of View. Front. Robot. AI 7:116. doi: 10.3389/frobt.2020.00116
Install the Python catkin utilities:
pip install catkin_pkg
In a terminal, go into the root folder of the package:
cd engagement_detector/
and install:
pip install .
Then download the Keras model of the network:
./download_model.sh
Now you can build the package in your catkin workspace (see http://wiki.ros.org/catkin/Tutorials/create_a_workspace), for example as sketched below.
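A minimal build sketch, assuming the package has been cloned into a standard catkin workspace at ~/catkin_ws/src (adapt the path and build tool to your setup):
cd ~/catkin_ws
catkin_make
source devel/setup.bash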
Then launch the detector:
roslaunch engagement_detector engagement_detector.launch
The predicted engagement value is published on the topic /engagement_detector/value.
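To quickly inspect the published values from the command line, you can use the standard rostopic tool:
rostopic echo /engagement_detector/value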
The launch file accepts the following parameters:
image (default: /camera/color/image_raw): the input camera image topic
debug_image (default: true): whether to publish the debug output image
out_image (default: /engagement_detector/out_image): the debug image topic
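If your camera publishes on a different topic, you can point the detector at it at launch time. This sketch assumes the launch file exposes image as a roslaunch argument, and /usb_cam/image_raw is a hypothetical topic name:
roslaunch engagement_detector engagement_detector.launch image:=/usb_cam/image_raw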
Running
rosrun image_view image_view image:=/engagement_detector/out_image
will show the camera image with the engagement series plotted on it. Something like this:
[Example images: single user example | multi users example]
On a GeForce GTX 1060 6GB GPU, the engagement value is published at a rate of about 5 Hz.
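You can check the actual publishing rate on your own hardware with the standard rostopic tool:
rostopic hz /engagement_detector/value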