
Nindamani the weed removal robot

AwardπŸ…

Our project, "Nindamani: The Weed Removal AgriRobot", won 1st prize 🏅 in the Autonomous Machines & Robotics category of NVIDIA's AI at the Edge challenge.

Check out the articles on NVIDIA's blog, the hackster.io page, and New Scientist's newsletter.

NVIDIA challenge winner

Note: Please mention our project title if you find this project helpful.

Project Details

Nindamani is an AI-based mechanical weed removal robot that autonomously detects and segments weeds from crops using artificial intelligence. All robot modules are built natively on ROS2. Nindamani can be used at any early stage of crop growth for autonomous weeding.

In this repository, you will find instructions for software installation and the control mechanism of the Nindamani robot.

Features:

  • Fully ROS2 compatible
  • Battery Operated
  • Runtime up to 8-10 hours
  • Robotic-arm-based weed removal
  • Weed detection accuracy up to 85%
  • Easy to Operate

Software Specifications:

| Parameter | Value |
| --- | --- |
| Robotics OS | ROS2 Dashing Diademata |
| System | Ubuntu 18.04 LTS |
| Communication | Wireless, UART (internal motor control) |
| AI Framework | Keras |
| Programming Language | Python3 & C |

Hardware Specifications:

| Parameter | Value |
| --- | --- |
| Degrees of freedom | 3 DOF |
| Error | ±2 mm |
| Payload | 1.5 kg |
| Weight | 35 kg |
| Height | 740 to 860 mm |
| Width | 980 mm |
| Arm reach | 200 x 200 sq. mm |
| Processor board | Jetson Nano Dev Kit |
| Microcontroller | Arduino Mega |
| Servo motor | 12 V DC, 200 RPM, 32 kg-cm holding torque |
| Stepper motor | 48 V, 6 A, NEMA 34, 87 kg-cm holding torque |
| Camera | RPi Camera v2 |
| WiFi card | Intel 8265 |
| USB-TTL cable | PL2303HX chip |
| Battery | 48 V, 30 Ah |

Packages

The repository is organized into the following packages, all of which are required to launch the Nindamani robot:

  • nindamani_agri_robot - integrates all launch nodes of the Nindamani robot (see the launch-file sketch after this list)
  • rpicam_ai_interface - controls the Raspberry Pi camera and runs the AI interface
  • servo_control - controls the servo motors through a ROS2 interface
  • stepper_control - controls multiple stepper motors through a ROS2 interface
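
To illustrate how these packages fit together, below is a minimal sketch of what a top-level ROS2 Dashing launch file in the style of nindamani_agri_robot.launch.py could look like. The executable and node names are assumptions for illustration and may differ from the actual launch file in this repository.

```python
# Hypothetical sketch of a top-level launch file (ROS2 Dashing API).
# Executable and node names below are assumptions, not the project's actual ones.
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    return LaunchDescription([
        # Camera capture + Mask R-CNN weed detection
        Node(package='rpicam_ai_interface',
             node_executable='rpicam_ai_node',
             node_name='rpicam_ai_interface',
             output='screen'),
        # Servo motor controller (gripper / arm joints)
        Node(package='servo_control',
             node_executable='servo_control_node',
             node_name='servo_control',
             output='screen'),
        # Stepper motor controller (delta arm axes)
        Node(package='stepper_control',
             node_executable='stepper_control_node',
             node_name='stepper_control',
             output='screen'),
    ])
```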

Installation on Jetson Nano Dev Kit

1. NVIDIA Jetpack SDK

2. Prerequisites and Dependencies for TensorFlow

3. ROS2 (Dashing Diademata)

4. Arduino

5. OpenCV 3.4.4

6. Wifi

Create ROS2 Workspace

  • Follow these steps:
  mkdir -p ~/nindamani_ws/src
  cd ~/nindamani_ws/src
  git clone https://github.com/AutoRoboCulture/nindamani-the-weed-removal-robot.git
  cd ~/nindamani_ws
  colcon build

Clone the Mask R-CNN GitHub Repository:

  1. Code: git clone https://github.com/matterport/Mask_RCNN.git
  2. Copy the cloned repo into the rpicam_ai_interface package: cp -r Mask_RCNN rpicam_ai_interface/.
  3. Run commands:
    • cd rpicam_ai_interface/Mask_RCNN
    • sudo python3 setup.py install
  4. Confirm the Library Was Installed: pip3 show mask-rcnn
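
If you want a quick sanity check beyond pip3 show, the short snippet below simply imports the installed modules. It assumes the Matterport repository installs itself under the mrcnn module name and that TensorFlow/Keras are already set up.

```python
# Quick sanity check: import the installed Mask R-CNN package (module name: mrcnn).
from mrcnn.config import Config
from mrcnn import model as modellib

print("Mask R-CNN modules imported:", Config.__name__, modellib.__name__)
```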

Download preTrained Model weights

  • Link for MASK-RCNN preTrained model
  • Copy preTrained weights to rpicam_ai_interface package:
    mkdir rpicam_ai_interface/preTrained_weights
    cp mask_rcnn_trained_weed_model.h5 rpicam_ai_interface/preTrained_weights/.
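
The sketch below shows one way these pretrained weights could be loaded for inference with the Matterport Mask R-CNN API. The class count, config values, and test image path are assumptions for illustration and should be adjusted to match the actual training setup used in rpicam_ai_interface.

```python
# Hypothetical inference sketch using the Matterport Mask R-CNN API.
# The class count and image path are assumptions for illustration only.
import skimage.io
from mrcnn.config import Config
from mrcnn import model as modellib


class WeedInferenceConfig(Config):
    NAME = "weed"
    GPU_COUNT = 1
    IMAGES_PER_GPU = 1
    NUM_CLASSES = 1 + 1          # background + weed (assumed single weed class)


config = WeedInferenceConfig()
model = modellib.MaskRCNN(mode="inference", config=config, model_dir="logs")
model.load_weights("rpicam_ai_interface/preTrained_weights/"
                   "mask_rcnn_trained_weed_model.h5", by_name=True)

image = skimage.io.imread("sample_field_image.jpg")   # assumed test image
results = model.detect([image], verbose=0)
print("Detected weed instances:", len(results[0]['rois']))
```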
    

Expected folder structure:

nindamani_ws
├── build
├── install
├── log
└── src
    ├── nindamani_agri_robot
    │   ├── launch
    │   └── scripts
    ├── rpicam_ai_interface
    │   ├── scripts
    │   ├── preTrained_weights
    │   └── Mask_RCNN
    ├── servo_control
    │   ├── config
    │   ├── scripts
    │   └── srv
    └── stepper_control
        ├── config
        ├── scripts
        ├── src
        └── srv
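
The servo_control and stepper_control packages define custom ROS2 services (the srv folders in the tree above). The snippet below is only a generic rclpy service-client sketch; it uses the standard std_srvs/Trigger type and an assumed service name as stand-ins, since the project's actual service definitions live in servo_control/srv and stepper_control/srv.

```python
# Generic rclpy service-client sketch (ROS2 Dashing).
# std_srvs/Trigger and the service name '/servo_control/trigger' are stand-ins;
# the real service types are defined in servo_control/srv and stepper_control/srv.
import rclpy
from std_srvs.srv import Trigger


def main():
    rclpy.init()
    node = rclpy.create_node('motor_client_demo')
    client = node.create_client(Trigger, '/servo_control/trigger')

    if not client.wait_for_service(timeout_sec=5.0):
        node.get_logger().error('service not available')
    else:
        future = client.call_async(Trigger.Request())
        rclpy.spin_until_future_complete(node, future)
        node.get_logger().info('response: %s' % future.result().message)

    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```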

Compile nindamani_ws

  • Follow these steps:
    cd ~/nindamani_ws
    colcon build
    

Dependency

Stepper Motor library implementation on Arduino

Launch nindamani robot

  • Make sure setup.bash is sourced in your ~/.bashrc before running the ROS2 launch command: echo "source /home/<user-name>/nindamani_ws/install/setup.bash" >> ~/.bashrc
  • ROS2 Launch command: ros2 launch nindamani_agri_robot nindamani_agri_robot.launch.py

Demo video | Proof of Concept

(Demo video link)

Potential Improvements

We have presented a proof of concept showing how weeds can be detected among crops using artificial intelligence and removed autonomously with a delta-arm robot. It is not perfect, of course, as you can see in the video, but it can be improved. Here are some of our ideas for improving this robot in the future:

  • Enhance the gripper design with an arrow-shaped end tip.
  • Improve the delta arm's reach with higher-torque stepper motors.
  • Add RTK-GPS and 4-wheel drive with 4-wheel steering to make the whole robot operate autonomously.
  • Add 3D mapping of the field using LiDAR to capture variations in the height of crops, weeds, and ridges.

References

  1. Mask R-CNN for Object Detection and Segmentation
  @misc{matterport_maskrcnn_2017,
    title={Mask R-CNN for object detection and instance segmentation on Keras and TensorFlow},
    author={Waleed Abdulla},
    year={2017},
    publisher={GitHub},
    journal={GitHub repository},
    howpublished={\url{https://github.com/matterport/Mask_RCNN}},
  }
  2. Train Mask-RCNN model on Custom Dataset for Multiple Objects

  3. Delta Robot Simulation on Gazebo using MARA-Env

Developers' Contact Details

Kevin Patel
Nihar Chaniyara
Email: autoroboculture@gmail.com