Engineering-Thesis

Mobile robot control system driven by hand gestures. The system consists of a deep neural network that classifies gestures captured by a camera, a communication module (Bluetooth 4.0 Low Energy), and Arlo, a mobile robot programmable in C.


Using deep neural networks to recognize hand gestures captured by a camera and control a mobile robot.

The purpose of this project was to build a system that allows the user to control the Arlo mobile robot with static hand gestures captured by a camera. The project consists of three parts. The first was to develop a script that recognizes the gestures controlling the robot; for this purpose the author used a convolutional neural network, which had to be properly trained. The next step was to create a robot program that reacts correctly to the application's commands. The last step was to handle communication between the programs, for which the Bluetooth 4.0 Low Energy standard was used. Unfortunately, the Bluetooth module was damaged during the final control test, so the author decided to use simple serial communication instead. The main goals of the project were real-time processing and high accuracy of hand gesture recognition.
Arlo: https://www.parallax.com/product/arlo-robotic-platform-system


Full video: https://www.youtube.com/watch?v=j6qOpACT1z0
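The gesture recognizer mentioned above is a convolutional neural network trained to classify static hand gestures from camera frames. The exact architecture is not reproduced in this README, so the snippet below is only a minimal PyTorch sketch of such a classifier; the GestureNet name, layer sizes, 64x64 grayscale input and class count are illustrative assumptions, not the trained network from the thesis.

```python
import torch
import torch.nn as nn

class GestureNet(nn.Module):
    """Minimal CNN for classifying static hand gestures (illustrative only)."""

    def __init__(self, num_classes: int = 13):  # assumption: 6 control + 7 setting commands
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Example: class probabilities for one 64x64 grayscale frame crop
model = GestureNet()
probs = torch.softmax(model(torch.randn(1, 1, 64, 64)), dim=1)
```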

Built With

  1. Python 3, with PyTorch, Bleak and PyQt
  2. C (on the Arlo robot)

Requirements

  1. Arlo mobile robot
  2. Bluetooth 4.0 LE module
  3. Laptop with Bluetooth enabled

Installation

  1. Clone the repo
  2. Install Python 3
  3. Install the Python dependencies: PyTorch, Bleak, PyQt
  4. Run the script "mainAplication.py"

Description

Mobile robot Arlo

GUI interface:

  1. Manual command list,
  2. Send the manually chosen command,
  3. Command window, contains information about captured gestures, command execution status, etc.,
  4. Frames per second,
  5. Neural network processing on/off,
  6. Automatic gesture detection on/off. If a gesture is detected for more than 2 s, the script automatically sends the command to Arlo (see the sketch below this list). Can also be toggled with Ctrl,
  7. Connect/Disconnect,
  8. Emergency stop, cancels all commands currently being executed. Can also be triggered with Shift,
  9. Commands and their probability of occurrence in the current frame,
  10. Frame preview in 640x480 resolution; the grey area is ignored in the gesture detection process.
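A sketch of the 2-second rule from item 6: a command is dispatched only after the same gesture has been observed continuously for the hold time, and only once per detection. The class and method names are illustrative, not taken from "mainAplication.py".

```python
import time
from typing import Optional

class GestureDebouncer:
    """Report a gesture only after it has been seen continuously for
    `hold_time` seconds (2 s in the GUI description above)."""

    def __init__(self, hold_time: float = 2.0):
        self.hold_time = hold_time
        self._current: Optional[str] = None   # gesture currently being tracked
        self._since: Optional[float] = None   # when the current gesture first appeared

    def update(self, gesture: Optional[str]) -> Optional[str]:
        """Call once per processed frame; returns the gesture to send, or None."""
        now = time.monotonic()
        if gesture != self._current:
            # New gesture (or no gesture): restart the timer.
            self._current, self._since = gesture, now
            return None
        if gesture is not None and self._since is not None and now - self._since >= self.hold_time:
            # Held long enough: fire once, then wait for the gesture to change.
            self._since = None
            return gesture
        return None
```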


Control commands:

  1. Default
  2. Turn left
  3. Turn right
  4. Turn around
  5. Forward
  6. Backward

Setting commands:

  1. Speed up
  2. Slow down
  3. Bigger turn angle
  4. Smaller turn angle
  5. Bigger step
  6. Smaller step
  7. Emergency Stop
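Both command sets are sent to the robot over Bluetooth 4.0 Low Energy, with plain serial communication as the fallback mentioned above. Below is a minimal Bleak sketch of how such a command could be written to the robot; the address, characteristic UUID and byte encoding are placeholders, not values taken from the project.

```python
import asyncio
from bleak import BleakClient

# Placeholders: the real robot address and the GATT characteristic used for
# commands are project-specific.
ROBOT_ADDRESS = "00:11:22:33:44:55"
COMMAND_CHAR_UUID = "0000ffe1-0000-1000-8000-00805f9b34fb"

async def send_command(command: bytes) -> None:
    """Write a single command to the robot over Bluetooth 4.0 LE."""
    async with BleakClient(ROBOT_ADDRESS) as client:
        await client.write_gatt_char(COMMAND_CHAR_UUID, command)

# Example: request "forward" (the byte encoding here is purely illustrative)
asyncio.run(send_command(b"F"))
```

If the Bluetooth link is unavailable, the same bytes can be written to a serial port instead (for example with pyserial), mirroring the fallback described above.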

License

Distributed under the MIT License. See LICENSE for more information.

Author

Sylwester Dawida
Poland, AGH
2020