/teleopros2

Provide teleop (video streaming, desktop control of robot) using WebRTC over ROS2


TeleOp on ROS2 using WebRTC

For ROS2 alone, or via the NVidia Docker setup for Jetson (Orin) Nano / x86.

(Currently using ROS2 Humble, which appears to be the version NVidia currently supports.)

The use case is for teleoperation of a ROS2 robot, providing:

  • teleop control:
    • view image from robot over internet (efficiently using WebRTC)
    • control the robot, using either on-screen controls or tilt on mobile devices
  • targeted at the NVidia Jetson Orin Nano stack, though that stack is not strictly required.
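To illustrate the tilt control idea: the browser reads device-orientation angles and converts them into a velocity command for the robot. A minimal sketch of one plausible mapping (the function name, angle convention, and scale factors here are hypothetical, not the package's actual code):

```python
def tilt_to_twist(beta_deg, gamma_deg, max_linear=0.5, max_angular=1.0):
    """Map device-orientation tilt angles (degrees) to (linear, angular) velocities.

    beta_deg: forward/back tilt; gamma_deg: left/right tilt.
    Tilt beyond +/-45 degrees is clamped to full speed.
    """
    def clamp(v):
        return max(-1.0, min(1.0, v / 45.0))

    linear = clamp(beta_deg) * max_linear       # tilt forward -> drive forward
    angular = -clamp(gamma_deg) * max_angular   # tilt right -> turn right (negative yaw)
    return linear, angular
```

The resulting pair would populate a geometry_msgs/Twist (linear.x, angular.z) published on the robot's cmd_vel topic.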

TeleOp Screenshots

This is based on aiortc, with inspiration from jetbot-ros2 and special consideration of webrtc_ros, among others.

Please see the Medium article for a richer discussion. Basically, this creates a browser-based control panel, leveraging aiortc for communication within the ROS2 environment.
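At its core, bridging ROS2 video into aiortc means converting each incoming sensor_msgs/Image into a numpy array that aiortc's video machinery can wrap. A simplified sketch of that conversion (the actual teleopros2.py code may differ; only the two most common color encodings are assumed here):

```python
import numpy as np

def image_msg_to_ndarray(height, width, encoding, data):
    """Convert raw sensor_msgs/Image fields into an HxWx3 uint8 array.

    Only 'rgb8' and 'bgr8' are handled; a real node would also account
    for the row 'step' (padding) and other encodings.
    """
    if encoding not in ("rgb8", "bgr8"):
        raise ValueError(f"unsupported encoding: {encoding}")
    arr = np.frombuffer(bytes(data), dtype=np.uint8).reshape(height, width, 3)
    if encoding == "bgr8":
        arr = arr[:, :, ::-1]   # reorder channels to RGB
    return arr
```

An aiortc VideoStreamTrack subclass can then wrap the array with av.VideoFrame.from_ndarray(arr, format="rgb24") inside its recv() coroutine.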

Installation

Starting from scratch

Two platforms are supported:

  • NVidia Jetson Orin Nano (others may work, but are untested).
  • x86 platform - this allows the use of Isaac Sim for software-in-the-loop simulation.

While this works in a 'standard' ROS2 Humble environment, the supported approach follows NVidia's suggestion of using a Docker environment. Follow the NVidia setup instructions.

Testing so far:

  • The Docker environment launches successfully. Note that we customize this in the next step.
cd ${ISAAC_ROS_WS}/src/isaac_ros_common && \
  ./scripts/run_dev.sh

On to TeleOpROS2

  1. Clone this repository (on the host)
cd ${ISAAC_ROS_WS}/src
git clone git@github.com:pgaston/TeleOpROS2.git
  1. Assuming you're using a Realsense camera, verify it works with:
realsense-viewer
  1. Copy all three of the files in the folder - this customizes the docker build process:
${ISAAC_ROS_WS}/src/TeleOpROS2/docker/.isaac_ros_common-config
${ISAAC_ROS_WS}/src/TeleOpROS2/docker/.isaac_ros_dev-dockerargs
${ISAAC_ROS_WS}/src/TeleOpROS2/docker/Dockerfile.teleopros2

to the folder

${ISAAC_ROS_WS}/src/isaac_ros_common/scripts

After this, run_dev.sh should work for our setup. This runs the 'standard' NVidia docker environment, with my additions from the copy step above.

cd ${ISAAC_ROS_WS}/src/isaac_ros_common && \
  ./scripts/run_dev.sh
  1. Additional requirements include:
  1. Add SSL certificates. SSL is the default and is required for mobile browsers; to disable it, set the 'ssl' parameter to false.
  • Create a 'certs' directory at the top level of teleopros2
  • Create local server.certs and server.key files in this directory. Here is one approach. Tip - don't add a passphrase.
  • (BTW, I included a certs.zip that you can expand into a certs folder there. Not secure in the slightest - but you can use it to test mobile/twist control. I can't include the certificates directly, as that sets off a GitHub security alert.)
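One way to generate the self-signed pair (the filenames follow the steps above; the CN and lifetime are arbitrary choices):

```shell
# Create the certs directory and a passphrase-free, self-signed key/cert pair.
mkdir -p certs
openssl req -x509 -newkey rsa:2048 -nodes \
    -keyout certs/server.key -out certs/server.certs \
    -days 365 -subj "/CN=localhost"
```

The -nodes flag is what skips the passphrase; -subj avoids the interactive prompts.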

Note - your browser will flag this as insecure; go to advanced / proceed anyway. Doing this 'correctly' is left as an exercise for the user.

  1. Build

Build (in the docker)

cd /workspaces/isaac_ros-dev
colcon build --symlink-install --packages-select teleopros2

Run to test: once you click Connect, the server sends video from the ROS2 source to the browser. At this point the image will indicate that it is not receiving ROS2 image messages, since you aren't publishing any yet.

  1. Run/test with realsense camera
source install/setup.bash
ros2 run teleopros2 teleopros2_node

You can then browse to the following page with your browser:

http://127.0.0.1:8080 or, for the secure version (default) https://127.0.0.1:8080

If you have a Realsense installed, run the following in another terminal (use run_dev.sh again to enter the same docker):

source install/setup.bash
ros2 launch realsense2_camera rs_launch.py

or, in a single launch file

ros2 launch teleopros2 teleRSCamera_launch.py

**Voila - WebRTC showing your Realsense image!**

  1. Run/test with NVidia Isaac Sim

Start Isaac Sim per the directions. BTW, I need to go to http://localhost:3080/ to restart all services (e.g., Nucleus).

./python.sh ${ISAAC_ROS_WS}/src/isaac_ros_nvblox/nvblox_examples/nvblox_isaac_sim/omniverse_scripts/start_isaac_sim.py --gpu_physics_enabled

Isaac Sim (at least my version) publishes the image on /image_raw, so you may need to change the teleopros2.py code to subscribe to that topic (this could be made easier...).
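Rather than editing the code, ROS2 topic remapping on the command line may be enough (the subscribed topic name shown here is a guess; check teleopros2.py for the node's actual default before relying on this):

```shell
# '/camera/color/image_raw' is a hypothetical default subscription topic.
ros2 run teleopros2 teleopros2_node --ros-args \
    --remap /camera/color/image_raw:=/image_raw
```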

To move the robot for testing purposes, a simple option is the 'teleop_twist_keyboard' node:

ros2 run teleop_twist_keyboard teleop_twist_keyboard

I also have a customization of the 'simple room' with a jetbot. This publishes images and accepts robot movement commands.

  1. Run/test with a GStreamer sourced camera, say a webcam or a CSI camera on the Jetson Orin Nano.

Adjust your 'gscam_config' string in launch/teleGSCamera_launch.py. The current default is to use /dev/video6.
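For reference, a typical 'gscam_config' pipeline for a V4L2 webcam looks like the following (the device path, caps, and exact parameter format are illustrative; check launch/teleGSCamera_launch.py for what the launch file actually expects):

```shell
# Hypothetical example of a gscam pipeline string for a USB webcam.
gscam_config='v4l2src device=/dev/video6 ! video/x-raw,width=640,height=480,framerate=30/1 ! videoconvert'
```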

To run

ros2 launch teleopros2 teleGSCamera_launch.py

and again, on your browser go to https://localhost:8080/

For Jetbot...

  1. Now for a Jetbot! (ROS, but an interesting github: https://github.com/issaiass/jetbot_diff_drive)

More ROS2 packages to install:

sudo apt install ros-humble-ros2-control
sudo apt install ros-humble-ros2-controllers

To do's (potentially)

  • Performance enhancement: use GStreamer to also create an H.264-compressed stream that can be used instead of the image topic. This would utilize the GPU most effectively. See discussion.
  • Use on a real Jetson Orin Nano robot. Turns out the original Nano is too old for the new NVidia docker setup, and the Orin Nano doesn't fit on, say, the Jetbot (different power needs, ...). Still building...
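As a starting point for the H.264 idea, something like the following pipeline exercises the Jetson's hardware encoder (an untested sketch; element names like nvv4l2h264enc and nvvidconv are Jetson-specific, and the device/caps are placeholders):

```shell
# Capture 300 frames, hardware-encode to H.264, and write an .mp4 (-e sends EOS on Ctrl-C).
gst-launch-1.0 -e v4l2src device=/dev/video0 num-buffers=300 ! \
    'video/x-raw,width=1280,height=720' ! nvvidconv ! \
    'video/x-raw(memory:NVMM)' ! nvv4l2h264enc ! h264parse ! \
    qtmux ! filesink location=test.mp4
```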

Useful links

Other links

  • linorobot project
  • jetbot-ros2 project

https://control.ros.org/humble/doc/getting_started/getting_started.html
https://github.com/ros-controls/

Clone the humble branch and follow the instructions: https://control.ros.org/master/doc/ros2_control_demos/doc/index.html#build-from-debian-packages

Scratch notes - getting ros2_control / ros2_control_demos to build


cd ${ISAAC_ROS_WS}/src/isaac_ros_common &&
./scripts/run_dev.sh


==>

rm -rf install/
rm -rf build/

rm -rf src/ros2_control_demos

ackermann_msgs

==> (to verify it all builds, then run example 1 to ensure it 'works') https://github.com/ros-controls/ros2_control_demos/blob/master/example_1/doc/userdoc.rst

rm -rf install/
rm -rf build/

. /opt/ros/${ROS_DISTRO}/setup.sh
colcon build --symlink-install

==> run an example

tab 1:

source install/setup.bash
ros2 launch ros2_control_demo_example_1 view_robot.launch.py

tab 2:

source /opt/ros/${ROS_DISTRO}/setup.bash
ros2 run joint_state_publisher_gui joint_state_publisher_gui

--> one or both...

git clone https://github.com/ros-controls/ros2_control_demos -b humble

for now, throws error - missing file

#include "hardware_interface/lexical_casts.hpp"

so deleted examples 2, 8, 14

seems like the apt install sources haven't caught up yet w/ the github code

--> should fix itself over time

for build from source - btw, not working yet...

git clone https://github.com/ros-controls/ros2_control.git -b humble
git clone https://github.com/ros-controls/ros2_controllers.git -b humble
git clone git@github.com:ros-controls/control_msgs.git -b humble
git clone git@github.com:ros-controls/realtime_tools.git

cd ..
rosdep update --rosdistro=$ROS_DISTRO
sudo apt update
sudo apt upgrade
rosdep install --from-paths ./ -i -y --rosdistro ${ROS_DISTRO}

must be some way to force building these first...
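colcon can compute that build order itself: --packages-up-to builds a package's recursive dependencies first, which may replace the long per-package sequence below (package names taken from the notes here; untested in this workspace):

```shell
# Build ros2_control and ros2_controllers plus everything they depend on, in order.
colcon build --symlink-install --packages-up-to ros2_control ros2_controllers
```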

. /opt/ros/${ROS_DISTRO}/setup.sh

NO - install from source

sudo apt install ros-humble-control-msgs

==>

colcon build --packages-select realtime_tools
colcon build --packages-select control_msgs
colcon build --packages-select ros2_control_test_assets
colcon build --packages-select controller_manager_msgs
colcon build --packages-select joint_limits
colcon build --packages-select rqt_controller_manager
colcon build --packages-select hardware_interface
colcon build --packages-select hardware_interface_testing
colcon build --packages-select controller_interface
colcon build --packages-select controller_manager
colcon build --packages-select transmission_interface
colcon build --packages-select ros2controlcli
colcon build --packages-select ros2_control

colcon build --symlink-install

source install/setup.bash
ros2 launch ros2_control_demo_example_2 view_robot.launch.py

colcon build --symlink-install

[1.807s] WARNING:colcon.colcon_core.package_selection:Some selected packages are already built in one or more underlay workspaces:
  'controller_interface' is in: /workspaces/isaac_ros-dev/install/controller_interface, /opt/ros/humble
  'hardware_interface' is in: /workspaces/isaac_ros-dev/install/hardware_interface, /opt/ros/humble
  'controller_manager' is in: /workspaces/isaac_ros-dev/install/controller_manager
  'ros2_control_test_assets' is in: /workspaces/isaac_ros-dev/install/ros2_control_test_assets

If a package in a merged underlay workspace is overridden and it installs headers, then all packages in the overlay must sort their include directories by workspace order. Failure to do so may result in build failures or undefined behavior at run time. If the overridden package is used by another package in any underlay, then the overriding package in the overlay must be API and ABI compatible or undefined behavior at run time may occur.

If you understand the risks and want to override a package anyways, add the following to the command line: --allow-overriding controller_interface controller_manager hardware_interface ros2_control_test_assets

colcon build --packages-select controller_manager_msgs joint_limits controller_interface transmission_interface controller_manager ros2controlcli

  • controller_manager_msgs
  • joint_limits
  • controller_interface
  • transmission_interface
  • controller_manager
  • ros2controlcli