AirFlow

AirFlow - Gesture-based air canvas using Machine Learning and Computer Vision.


Description

AirFlow is a project that utilizes hand gestures for controlling and interacting with digital content. It employs computer vision techniques to track hand movements captured by a webcam and translates them into various actions, such as drawing on a canvas or controlling applications.


Features

  • Hand gesture recognition using Mediapipe framework
  • Real-time interaction with digital content
  • Supports multiple colors and drawing modes
  • Clear button for resetting the canvas
  • Easy setup and usage
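The hand-tracking loop behind these features can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the project's actual AirFlow.py: the camera index, window name, and confidence thresholds are placeholders, and the webcam loop only runs when a camera is available.

```python
def landmark_to_px(lm_x, lm_y, width, height):
    """MediaPipe reports landmarks normalized to [0, 1];
    convert them to pixel coordinates in the frame."""
    return int(lm_x * width), int(lm_y * height)

def main():
    # Heavy dependencies imported here so the helper above stays
    # importable without a camera or the full dependency stack.
    import cv2
    import mediapipe as mp
    import numpy as np

    hands = mp.solutions.hands.Hands(max_num_hands=1,
                                     min_detection_confidence=0.7)
    cap = cv2.VideoCapture(0)
    canvas = None

    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        frame = cv2.flip(frame, 1)  # mirror for natural interaction
        if canvas is None:
            canvas = np.zeros_like(frame)

        result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.multi_hand_landmarks:
            h, w = frame.shape[:2]
            tip = result.multi_hand_landmarks[0].landmark[8]  # index fingertip
            x, y = landmark_to_px(tip.x, tip.y, w, h)
            cv2.circle(frame, (x, y), 8, (0, 255, 0), -1)

        cv2.imshow("AirFlow sketch", cv2.add(frame, canvas))
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    main()
```

The key step is converting MediaPipe's normalized landmark coordinates into pixel positions; once the index fingertip is located per frame, drawing reduces to connecting successive fingertip positions on a persistent canvas image.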

Installation and running

  1. Clone the repository:

     git clone https://github.com/MadhumithaKolkar/AirFlow.git

  2. Install the required packages:

     python -m pip install opencv-python numpy mediapipe

  3. Run the Python script:

     python AirFlow.py

  4. Use hand gestures to interact with the application:

     • Move your hand to draw on the canvas
     • Change colors by selecting different regions on the screen
     • Clear the canvas by pressing the "CLEAR" button
     • Exit the application by pressing the "q" key
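Selecting colors "by region" can be implemented as fixed hit-test zones along the top of the frame: if the fingertip is inside the toolbar strip, the button under it is activated instead of drawing. A sketch of that logic follows; the region layout, pixel coordinates, and function name are illustrative assumptions, not taken from AirFlow.py.

```python
# Hypothetical toolbar layout: the top strip of the frame holds the
# buttons, and everything below it is the drawing area.
TOOLBAR_HEIGHT = 65  # pixels; assumed value

# (label, x_start, x_end) for each on-screen button; assumed positions.
BUTTONS = [
    ("CLEAR",   40, 140),
    ("BLUE",   160, 255),
    ("GREEN",  275, 370),
    ("RED",    390, 485),
    ("YELLOW", 505, 600),
]

def hit_button(x, y):
    """Return the label of the button under fingertip (x, y), or None
    if the fingertip is in the drawing area or between buttons."""
    if y > TOOLBAR_HEIGHT:
        return None  # below the toolbar: keep drawing
    for label, x0, x1 in BUTTONS:
        if x0 <= x <= x1:
            return label
    return None
```

For example, `hit_button(200, 30)` lands in the "BLUE" region, while `hit_button(200, 300)` is below the toolbar and returns None, so the fingertip keeps drawing.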

Contributing

Contributions are welcome! Here's how you can contribute:

  • Fork the repository
  • Create your feature branch (git checkout -b feature/your-feature)
  • Commit your changes (git commit -am 'Add some feature')
  • Push to the branch (git push origin feature/your-feature)
  • Create a new Pull Request

Credits

  • Mediapipe - Hand tracking framework
  • OpenCV - Computer vision library

Creators

  • Madhumitha Kolkar

License

  • This project is licensed under the MIT License - see the LICENSE file for details.