MAE-6592 final class project
Install the Gazebo ROS packages and keyboard teleop dependencies:
sudo apt update
sudo apt-get install ros-melodic-teleop-twist-keyboard ros-melodic-gazebo-ros-pkgs ros-melodic-gazebo-ros-control
See http://gazebosim.org/tutorials/?tut=ros_depth_camera for the full installation tutorial.
To install the ROS nodes, clone this repo into your catkin workspace and build it:
cd /home/$USER/catkin_ws/src/
git clone https://github.com/km5es/Robotic-Autonomy-Project.git
cd ..
catkin_make
source devel/setup.bash
NOTE: If the human_detector node is not already executable, make it executable:
chmod +x src/Robotic-Autonomy-Project/human_detector/scripts/human_detector.py
Launch the robot in Gazebo and RViz, then start keyboard teleop control:
roslaunch kbot_description kbot_base_rviz_gazebo.launch
roslaunch kbot_simple_control kbot_control_teleop.launch
Optional: In the open RViz window, set the Fixed Frame to 'camera_link' and add a PointCloud2 display on topic '/camera/depth/points' and an Image display on topic '/camera/color/image_raw'.
Run the human detector node:
rosrun human_detector human_detector.py
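The detector's internals are not documented here; as a rough, non-authoritative sketch only, the snippet below shows what a minimal ROS image-subscriber node of this kind could look like, using OpenCV's stock HOG people detector on the '/camera/color/image_raw' topic from the RViz step above. The detection method, node name, and logging are assumptions, not the actual implementation in human_detector.py.

#!/usr/bin/env python
# Illustrative sketch only (not the project's human_detector.py): a minimal
# ROS node that subscribes to the color image topic and runs OpenCV's stock
# HOG people detector on each frame.
import rospy
import cv2
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

bridge = CvBridge()
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def image_callback(msg):
    # Convert the ROS image message to an OpenCV BGR array.
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
    # Detect people and log the centre pixel of each bounding box.
    boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))
    for (x, y, w, h) in boxes:
        rospy.loginfo("person at pixel (%d, %d)", x + w // 2, y + h // 2)

if __name__ == '__main__':
    rospy.init_node('human_detector_sketch')   # hypothetical node name
    rospy.Subscriber('/camera/color/image_raw', Image, image_callback)
    rospy.spin()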
The human ground-truth data will be saved to /home/$USER/track_human.csv.
The robot ground-truth data will be saved to /home/$USER/track_robot.csv.
The EKF output will be saved to /home/$USER/track_EKF.csv.
The data folder contains some example data and a video.
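To get a quick look at a run (or at the example data), the CSV dumps can be loaded with pandas. This is only a sketch under assumptions: the column layout is not specified above, so the snippet plots the first two columns of each file and should be adjusted to the headers actually written by the node.

# Quick-look comparison of the EKF estimate against the human ground truth.
import os
import pandas as pd
import matplotlib.pyplot as plt

home = os.path.expanduser('~')
truth = pd.read_csv(os.path.join(home, 'track_human.csv'))
ekf = pd.read_csv(os.path.join(home, 'track_EKF.csv'))

# Assumption: the first two columns of each file are the x and y positions;
# adjust the column selection to the CSV layout actually written.
plt.plot(truth.iloc[:, 0], truth.iloc[:, 1], label='human ground truth')
plt.plot(ekf.iloc[:, 0], ekf.iloc[:, 1], '--', label='EKF estimate')
plt.xlabel('x [m]')
plt.ylabel('y [m]')
plt.legend()
plt.show()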
If the robot gets stuck in a wall, there is a dedicated teleop launch to get it out:
roslaunch kbot_simple_control follower_control_teleop.launch