RobotIQ

Empowering Mobile Robots with Human-Level Planning



https://github.com/emmarapt/RobotIQ/blob/main/images/RobotIQ_logo.png

An open-source, easy-to-use, and universally adaptable robotic library suite for any robot.

https://github.com/emmarapt/ASPiDA_ChatGPT/blob/main/images/Turtlebot3.png

An AI-ROS-based framework that empowers mobile robots with human-level planning through text or voice commands, bridging the gap between humans and robots by leveraging secure and precise instructions from ChatGPT to generate robot-related code.
View Demo · Report Bug · Request Feature

Table of Contents
  1. About The Project
  2. Prerequisites
  3. Installation
  4. How to run
  5. Mobile robot navigation using deep reinforcement learning
  6. Video Demos
  7. Real-life Experiments
  8. License
  9. Contact
  10. Acknowledgments
  11. References

About The Project

This project aims to enhance the capabilities of mobile robots by equipping them with planning abilities akin to human-level thinking. It seeks to bridge the gap between humans and robots by leveraging secure and precise instructions from ChatGPT, guided by natural language. Toward this direction, we introduce a high-level function library that addresses the challenge of AI-generated code drift when employing Large Language Models (LLMs) to generate robot-related code, which may otherwise compromise the robot's structural integrity or cause the assigned task to fail. This function library can be seamlessly integrated into any robotic platform through APIs, allowing ChatGPT to comprehend user objectives expressed in natural language and translate them into a logical sequence of high-level function calls.
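The idea of a restricted high-level function library can be sketched as follows. This is a minimal illustration, not the actual RobotIQ API: the function names (go_to, pick, place) and the plan format are assumptions for demonstration. The key point is that the dispatcher only executes calls registered in the library, so LLM-generated plans cannot invoke arbitrary code.

```python
# Minimal sketch of a high-level function library with a safe dispatcher.
# Function names and the plan schema are illustrative, not the real API.

ALLOWED_FUNCTIONS = {}

def register(fn):
    """Expose a function to the LLM-facing dispatcher."""
    ALLOWED_FUNCTIONS[fn.__name__] = fn
    return fn

@register
def go_to(location):
    return f"navigating to {location}"

@register
def pick(obj):
    return f"picking {obj}"

@register
def place(obj, location):
    return f"placing {obj} at {location}"

def execute_plan(plan):
    """Run an LLM-produced sequence of {'name': ..., 'args': [...]} steps,
    rejecting anything outside the registered library."""
    results = []
    for step in plan:
        fn = ALLOWED_FUNCTIONS.get(step["name"])
        if fn is None:
            raise ValueError(f"unknown function: {step['name']}")
        results.append(fn(*step.get("args", [])))
    return results

# A plan ChatGPT might emit for "bring the bottle to the table":
plan = [
    {"name": "go_to", "args": ["kitchen"]},
    {"name": "pick", "args": ["bottle"]},
    {"name": "go_to", "args": ["table"]},
    {"name": "place", "args": ["bottle", "table"]},
]
print(execute_plan(plan))
```

Because any call outside ALLOWED_FUNCTIONS raises an error instead of executing, drifted or hallucinated code from the model is caught before it reaches the robot.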

Prerequisites

Step 1. Install Ubuntu and ROS

Step 2. Set up catkin_ws for ROS with Python 3.6

For ROS Melodic & Kinetic, you can refer to the following guide.

Step 3. Clone the following repository:

cd catkin_ws/src
git clone https://github.com/emmarapt/turtlebot3_home_service_challenge.git

Note: To install additional packages, please refer to the following e-manual.

Build catkin with Python 3.6
cd ~/catkin_ws
source py36env/bin/activate
export PYTHONPATH=$PYTHONPATH:/usr/local/lib/python3.6/dist-packages 
catkin build --cmake-args -DCMAKE_BUILD_TYPE=Release -DPYTHON_EXECUTABLE=/your/path/catkin_ws/py36env/bin/python3.6 -DPYTHON_LIBRARY=/usr/lib/python3.6/config-3.6m-x86_64-linux-gnu/libpython3.6m.so -DPYTHON_INCLUDE_DIR=/usr/include/python3.6m

Installation

Step 1. Clone the repo

git clone https://github.com/emmarapt/ASPiDA_ChatGPT.git

Step 2. Install the required system packages:

pip install -r requirements.txt

Note: TensorFlow version for CPU & GPU support may differ based on your system requirements.

ChatGPT for Python 3.6:

To install openai==0.27.2, follow these steps:

git clone https://github.com/openai/openai-python.git

Build from source:

cd openai-python
python setup.py install

An error will occur:

from contextlib import asynccontextmanager

ImportError: cannot import name 'asynccontextmanager'

Navigate to "/your/path/openai-python/openai/api_requestor.py" and change:

from contextlib import asynccontextmanager

to

from async_generator import asynccontextmanager

Uninstall any existing openai package and run:

python setup.py install
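Once the patched client is installed, a bot script can query ChatGPT through the openai==0.27.2 ChatCompletion API. The sketch below is a hedged illustration of how such a request might be structured; the model name, system prompt, and helper functions are assumptions for demonstration, and the actual ChatBot.py/VoiceBot.py scripts may differ.

```python
# Hedged sketch of querying ChatGPT via the openai==0.27.2 client.
# Model name, prompt text, and helper names are illustrative assumptions.
import os

def build_messages(user_command):
    """Wrap a user's text/voice command with a system prompt that restricts
    ChatGPT to the high-level function library."""
    system = ("You control a mobile robot. Respond only with calls to the "
              "provided high-level function library.")
    return [{"role": "system", "content": system},
            {"role": "user", "content": user_command}]

def extract_reply(response):
    """Pull the assistant text out of a ChatCompletion-style response dict."""
    return response["choices"][0]["message"]["content"]

if __name__ == "__main__" and os.environ.get("OPENAI_API_KEY"):
    import openai
    openai.api_key = os.environ["OPENAI_API_KEY"]
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=build_messages("go to the kitchen and pick up the bottle"),
    )
    print(extract_reply(response))
```

The network call is guarded behind an OPENAI_API_KEY check so the helpers can be exercised without credentials.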

Step 3. Simulate keyboard events

Open a terminal, go to Terminal->Preferences, and enable mnemonics.

How to run

  1. Navigate to the project directory via a terminal and run:

    python ChatBot.py or python VoiceBot.py
  2. Do not interrupt until initialization completes.

  3. Text or speak.

Enjoy!

(back to top)

Mobile robot navigation using deep reinforcement learning

This repository also contains an OpenAI reinforcement learning setup for the Turtlebot3 in Python 3.6, allowing the use of Spinning Up, Stable Baselines, or Baselines deep reinforcement learning algorithms for robot navigation training.

  1. Install an OpenAI Gym extension for using Gazebo, known as gym-gazebo [Optional]
  2. Install at least one of OpenAI Spinning Up, Stable Baselines, or Baselines in the python3.6 virtual env

There are two ways of starting training and evaluation:

  1. If gym-gazebo is installed: use gazebo_env.GazeboEnv in env.py and run python robot_main_####.py . This will automatically launch the robot configurations in Gazebo and start training/evaluation.

    Note: Add gazebo_env.GazeboEnv.__init__(self, "competition.launch") in env.py

  2. Otherwise, use gym.Env in env.py and launch the simulation manually:
cd ~/catkin_ws
source /opt/ros/$ROS_DISTRO/setup.bash 
source ~/catkin_ws/devel/setup.bash
roslaunch turtlebot3_home_service_challenge_simulation competition.launch
Then run python robot_main.py to start training/evaluation, and go to aspida_wrapper.py to specify your trained policy.
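The training setup above follows the standard Gym reset/step interface. The toy environment below illustrates that interface in isolation; it is a stand-in for demonstration only, since the real env.py wraps Gazebo/ROS, and its state, actions, and rewards are assumptions, not the actual Turtlebot3 environment.

```python
# Toy stand-in for a navigation environment, following the gym.Env
# reset/step contract: step returns (observation, reward, done, info).
# The real env.py wraps Gazebo/ROS; everything here is illustrative.

class ToyNavEnv:
    """1-D corridor: the robot starts at 0 and must reach position `goal`."""
    def __init__(self, goal=5):
        self.goal = goal
        self.pos = 0

    def reset(self):
        self.pos = 0
        return self.pos

    def step(self, action):            # action: -1 (back) or +1 (forward)
        self.pos += action
        done = self.pos == self.goal
        reward = 1.0 if done else -0.1  # step penalty, bonus at the goal
        return self.pos, reward, done, {}

def run_episode(env, policy, max_steps=50):
    """Roll out one episode with a policy mapping observation -> action."""
    obs, total = env.reset(), 0.0
    for _ in range(max_steps):
        obs, reward, done, _ = env.step(policy(obs))
        total += reward
        if done:
            break
    return total

env = ToyNavEnv()
always_forward = lambda obs: 1
print(run_episode(env, always_forward))  # reaches the goal in 5 steps
```

Any environment exposing this reset/step shape can be plugged into Spinning Up, Stable Baselines, or Baselines training loops with little or no glue code.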

(back to top)

Video Demos

Now that you have the ASPiDA ChatGPT running, you can use it to empower mobile robots using text or voice commands. Here is a video demonstration of how ChatGPT initializes this project using ROS, Gazebo, and Rviz.

In the following videos, we explore the groundbreaking fusion of cutting-edge technology and robotics: ChatGPT driving both the ChatBot and the VoiceBot. ChatGPT seamlessly translates natural language voice commands into robotics objectives, empowering an assistive robot designed to aid the elderly in a realistic virtual environment.

ChatBot VoiceBot

Real-life Experiments

To validate the efficiency of ASPiDA ChatGPT, we evaluate the custom robotic library using the VoiceBot in a real-world scenario: from picking to placing an object.

Watch the Turtlebot3 Waffle Pi with OpenManipulator-X Showcase Precision: From Picking to Placing using voice commands!

Watch a Turtlebot3 real-life experiment

In this video, the Turtlebot3 adeptly executes a precision-driven task: picking up a bottle of water and placing it at a specific target location. The intricacy of the operation becomes even more impressive as the entire process is orchestrated within an augmented reality (AR) environment, using distinct AR markers for accurate positioning. Watch as the Turtlebot3 identifies, approaches, and seamlessly grasps the water bottle, perfectly centered within an AR marker.

License

Distributed under the MIT License. See LICENSE for more information.

(back to top)

Contact

Emmanuel K. Raptis - github - emmarapt@iti.gr

Athanasios Ch. Kapoutsis - github - athakapo@iti.gr

(back to top)

Acknowledgments

This research has been co-financed by the European Union and Greek national funds through the Operational Program Competitiveness, Entrepreneurship and Innovation, under the call RESEARCH-CREATE-INNOVATE (T2EDK-02743).

(back to top)

Cite As

(Not published yet)