
assembly_projection_mapping_teaching

Overview

This ROS package provides an immersive projection mapping system for interactively teaching assembly operations.

This project has the following associated paper:

Modeling of video projectors in OpenGL for implementing a spatial augmented reality teaching system for assembly operations

Currently it provides assembly instructions for a Mitsubishi M000T20873 starter motor, but it can easily be reconfigured for other tasks: add the new content to the media folder and update the yaml/assembly.yaml file.
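The exact schema of yaml/assembly.yaml is defined by this package; as a rough illustration only (the keys, file names, and step descriptions below are hypothetical), a step-based configuration could look something like:

```yaml
# Hypothetical sketch of a step-based assembly configuration.
# The actual keys in yaml/assembly.yaml may differ -- check the file
# shipped with the package before editing it.
assembly:
  steps:
    - description: "Insert the armature into the motor housing"
      media: "media/step_01.mp4"   # instruction video for this step
    - description: "Mount the solenoid"
      media: "media/step_02.png"   # static image instruction
```

Whatever the real schema is, the pattern is the same: each assembly step pairs a textual instruction with the media file to be projected.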

Video 1: Assisted assembly of a starter motor

Video 2: Immersive natural interaction for assisted assembly operations

Video 3: Object pose estimation for assisted assembly operations

Video 4: Projection mapping for assisted assembly operations

Video 5: Disassembly of a starter motor

Software installation

Quick overview of the main installation steps:

Notes:

Before compiling the packages, check that you have installed all the required dependencies (use rosdep to speed up this task):

cd ~/catkin_ws
rosdep check --from-paths src --ignore-src
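If the check reports missing dependencies, rosdep can also install them directly. This is standard rosdep usage rather than anything specific to this package, and it requires a configured ROS environment:

```shell
cd ~/catkin_ws
# Install any system dependencies reported as missing by the check above
# (-r continues past failures, -y answers yes to install prompts)
rosdep install --from-paths src --ignore-src -r -y
```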

Hardware setup and calibration

This package was tested with an Asus Xtion Pro Live for object recognition and a Kinect 2 for bare-hand human-machine interaction.

You will need to install the hardware drivers and calibrate them.

Sensor drivers:

Sensor calibration:

You will also need to calibrate the projector (using, for example, this tool) and update the intrinsic parameters of the rendering camera in worlds/assembly.world.

Finally, you will need to compute the extrinsics (position and orientation relative to the chessboard origin) of the sensors and projector (using the charuco_detector) and update launch/assembly_tfs.launch.
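Extrinsics like these are typically published in ROS as static transforms. As a sketch of what an entry in launch/assembly_tfs.launch might look like (the node name, frame names, and numeric values below are placeholders, not the package's actual calibration), using tf's static_transform_publisher:

```xml
<launch>
  <!-- Placeholder values: x y z (meters), yaw pitch roll (radians),
       parent frame, child frame, publish period in milliseconds -->
  <node pkg="tf" type="static_transform_publisher" name="projector_tf"
        args="0.5 0.0 1.2 0 0 0 chessboard_origin projector_frame 100" />
</launch>
```

After calibration, replace the placeholder translation and rotation with the values computed from the charuco_detector, one publisher per sensor and one for the projector.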

Usage

The teaching system can be started with the following launch file:

roslaunch assembly_projection_mapping_teaching assembly.launch

You can also start several modules of the system individually (such as the sensors, rendering, or perception). See the launch folder and tests.txt.

After the system is started, you can navigate between the textual / video instructions using the projected buttons, and can also pause / play / seek the video. In the last step, the outline of the 3D model is projected into the workspace for visual inspection and assembly validation. Check the videos above for a demonstration of the system's functionality.

Main input topics

  • command (std_msgs::String) | Topic for processing string commands (listed below) that change the content being projected (the # symbol corresponds to a number)
    • "next_step"
    • "previous_step"
    • "first_step"
    • "last_step"
    • "step: #"
    • "play_video"
    • "pause_video"
  • change_step (std_msgs::Int32) | Topic for changing the projection content to the specified assembly step
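For example, the string commands can be published from the terminal with rostopic. The topic paths below assume the topics live at the root namespace; adjust them if the node is namespaced or remapped:

```shell
# Advance to the next assembly step
rostopic pub -1 /command std_msgs/String "data: 'next_step'"

# Jump directly to step 3
rostopic pub -1 /command std_msgs/String "data: 'step: 3'"

# Pause the instruction video currently playing
rostopic pub -1 /command std_msgs/String "data: 'pause_video'"

# Equivalent jump using the Int32 topic instead of a string command
rostopic pub -1 /change_step std_msgs/Int32 "data: 3"
```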

Main output topics

  • current_step (std_msgs::Int32) | Latched topic reporting the assembly step currently being projected
  • status (std_msgs::String) | Latched status topic reporting the changes being performed on the projected content
    • "next_step"
    • "previous_step"
    • "first_step"
    • "last_step"
    • "move_to_step"
    • "step_number: #"
    • "running"
    • "paused"
    • "seek_video: #"
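Because both output topics are latched, a new subscriber immediately receives the last published value, so they can be inspected at any time from the terminal (again assuming root-namespace topic paths):

```shell
# Print the step currently being projected (latched, so the last value
# is delivered even if it was published before subscribing)
rostopic echo /current_step

# Follow the status messages as the projected content changes
rostopic echo /status
```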