The Vision Assistance App is designed to empower individuals who are blind or have severe visual impairments by helping them navigate the real world independently. Leveraging AI-powered object detection, haptic feedback, and emergency features, the app enhances accessibility and safety.
## Features

### Real-Time Object Detection
- Uses the mobile device's camera to detect objects in real time.
- Divides the field of view into a 9-quadrant grid (3x3) for precise object localization (see the sketch below).
- Estimates object distance and provides intensity-based haptic feedback.
- Recognizes common household items and human faces.
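As a rough illustration of how the grid and haptics could fit together, here is a minimal Kotlin sketch that maps a detection's bounding-box center to a 3x3 grid cell and scales vibration amplitude by estimated distance. The `GridCell` type, the 5 m range, and the linear intensity mapping are assumptions, not the app's actual implementation.

```kotlin
import kotlin.math.min

// Illustrative: which of the nine grid cells an object falls into.
data class GridCell(val row: Int, val col: Int)

// Maps a detection's bounding-box center to a cell in the 3x3 grid.
fun locateInGrid(centerX: Float, centerY: Float, frameWidth: Int, frameHeight: Int): GridCell {
    val col = min(2, (centerX / frameWidth * 3).toInt())   // 0..2, left to right
    val row = min(2, (centerY / frameHeight * 3).toInt())  // 0..2, top to bottom
    return GridCell(row, col)
}

// Closer objects vibrate harder: map an estimated distance in meters
// to an Android vibration amplitude (1..255). Linear falloff is assumed.
fun hapticAmplitude(distanceMeters: Float, maxRange: Float = 5f): Int {
    val proximity = (1f - (distanceMeters / maxRange)).coerceIn(0f, 1f)
    return (1 + proximity * 254).toInt()
}
```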
### On-Screen Buttons
- Describe Object in Detail: Short description on tap, detailed info on long press (see the sketch below).
- Call Emergency Contact: Quick dial to a preset contact (family, caretaker, etc.).
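A minimal sketch of the tap vs. long-press split on the describe button, assuming the speech helpers are passed in as callbacks (the names here are illustrative):

```kotlin
import android.widget.Button

// Wires one button to two behaviors: a short spoken description on tap,
// and a detailed one on long press.
fun wireDescribeButton(
    describeButton: Button,
    speakShort: () -> Unit,      // e.g., backed by TextToSpeech
    speakDetailed: () -> Unit
) {
    describeButton.setOnClickListener { speakShort() }
    describeButton.setOnLongClickListener {
        speakDetailed()
        true  // consume the event so the tap handler does not also fire
    }
}
```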
### Customizable Buttons
- Voice commands
- Saving object details for later reference
- Adjusting vibration intensity
- Toggling features on/off (see the settings sketch below)
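These customizations need to survive restarts; a minimal sketch using `SharedPreferences`, where the store name and keys are illustrative:

```kotlin
import android.content.Context

// Persists button customizations across app launches.
class AssistSettings(context: Context) {
    private val prefs = context.getSharedPreferences("vision_assist", Context.MODE_PRIVATE)

    var vibrationIntensity: Int
        get() = prefs.getInt("vibration_intensity", 128)  // 1..255
        set(value) = prefs.edit().putInt("vibration_intensity", value).apply()

    var voiceCommandsEnabled: Boolean
        get() = prefs.getBoolean("voice_commands", true)
        set(value) = prefs.edit().putBoolean("voice_commands", value).apply()
}
```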
### Physical Button Shortcuts
- Volume and power button combinations trigger quick actions (see the sketch below).
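An illustrative sketch of intercepting the volume keys in an Activity to trigger quick actions. Which combination maps to which action is an assumption, and the two helpers are hypothetical:

```kotlin
import android.app.Activity
import android.view.KeyEvent

class MainActivity : Activity() {
    override fun onKeyDown(keyCode: Int, event: KeyEvent): Boolean {
        return when (keyCode) {
            KeyEvent.KEYCODE_VOLUME_UP -> {
                describeCurrentObject()   // hypothetical helper
                true                      // consume the event
            }
            KeyEvent.KEYCODE_VOLUME_DOWN -> {
                callEmergencyContact()    // hypothetical helper
                true
            }
            else -> super.onKeyDown(keyCode, event)
        }
    }

    private fun describeCurrentObject() { /* hooked up elsewhere */ }
    private fun callEmergencyContact() { /* hooked up elsewhere */ }
}
```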
### Safety & Emergency
- GPS Functionality: Navigation and location tracking.
- Fall Detection: Automatically triggers an emergency alert (see the sketch below).
- Medical Information Storage: Quick access to health data in emergencies.
- Automated Emergency Messages: Sends location and medical info to hospitals and family members.
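A rough sketch of how fall detection could trigger an automated message, assuming a simple accelerometer spike threshold and SMS via `SmsManager`. Real fall detection needs a more robust heuristic; the threshold, message text, and permission handling are simplified here.

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import android.telephony.SmsManager
import kotlin.math.sqrt

class FallDetector(
    private val context: Context,
    private val emergencyNumber: String
) : SensorEventListener {

    fun start() {
        val sm = context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
        sm.registerListener(
            this,
            sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
            SensorManager.SENSOR_DELAY_NORMAL
        )
    }

    override fun onSensorChanged(event: SensorEvent) {
        val (x, y, z) = event.values
        val magnitude = sqrt(x * x + y * y + z * z)
        if (magnitude > 25f) {  // sudden spike ~ possible fall; threshold assumed
            sendEmergencySms()
        }
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) {}

    // Requires the SEND_SMS permission; the full app would append
    // location and stored medical info to the message body.
    private fun sendEmergencySms() {
        SmsManager.getDefault().sendTextMessage(
            emergencyNumber, null,
            "Possible fall detected. Please check on me.", null, null
        )
    }
}
```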
### Additional Features
- Voice Assistance: Audio cues for enhanced interaction.
- Customizable Grid Layouts: Choose between 3x3 and 2x4 grid systems.
- AI-Powered Object Recognition: Powered by TensorFlow Lite, OpenCV, or YOLO.
- Offline Functionality: Essential features work without an internet connection.
- Personalized Vibration Patterns: Different haptic responses for various object types (see the sketch below).
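A minimal sketch of per-object-type patterns with the Android Vibration API (API 26+); the specific object types and timings are assumptions:

```kotlin
import android.content.Context
import android.os.VibrationEffect
import android.os.Vibrator

fun vibrateForObject(context: Context, objectType: String) {
    val vibrator = context.getSystemService(Context.VIBRATOR_SERVICE) as Vibrator
    // Waveform timings alternate off/on durations in milliseconds,
    // starting with an initial delay.
    val pattern = when (objectType) {
        "person"   -> longArrayOf(0, 100, 50, 100)  // double pulse
        "obstacle" -> longArrayOf(0, 400)           // single long buzz
        else       -> longArrayOf(0, 50)            // short tick
    }
    vibrator.vibrate(VibrationEffect.createWaveform(pattern, -1))  // -1 = no repeat
}
```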
## How It Works

1. Launch the app.
2. Point the camera at your surroundings.
3. Receive real-time feedback via haptic cues.
4. Use on-screen or physical controls for additional assistance.
5. Enable GPS navigation or emergency alerts for added security.
## Tech Stack

- Mobile Platforms: Java/Kotlin (Android) | Swift (iOS)
- Machine Learning: TensorFlow Lite / OpenCV / YOLO (see the detection sketch below)
- Haptic Feedback: Android Vibration API, iOS Core Haptics
- GPS & Emergency Alerts: Google Maps API, Twilio API
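To make the ML entry concrete, here is a minimal detection sketch using the TensorFlow Lite Task Library, one of the options listed above. The model file name and score threshold are assumptions; any detection model with metadata (e.g., an EfficientDet-Lite export) would slot in the same way.

```kotlin
import android.content.Context
import android.graphics.Bitmap
import org.tensorflow.lite.support.image.TensorImage
import org.tensorflow.lite.task.vision.detector.ObjectDetector

fun detectObjects(context: Context, frame: Bitmap) {
    val options = ObjectDetector.ObjectDetectorOptions.builder()
        .setMaxResults(5)
        .setScoreThreshold(0.5f)
        .build()
    val detector = ObjectDetector.createFromFileAndOptions(
        context, "model.tflite", options  // model file name assumed
    )
    for (detection in detector.detect(TensorImage.fromBitmap(frame))) {
        val label = detection.categories.first().label
        val box = detection.boundingBox  // could feed the grid mapper above
        println("$label at $box")
    }
}
```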
## Future Enhancements

- Multi-language support for global accessibility
- Dynamic sound cues to aid navigation
- Integration with smart wearables (e.g., smart glasses, AI-powered assistants)
- Cloud-based AI training for improved object recognition
## Installation

1. Clone the repository:
   ```bash
   git clone https://github.com/yourusername/Vision-Assistance-App.git
   ```
2. Open the project in Android Studio/Xcode.
3. Build and install the app on a test device.
## License

This project is licensed under the MIT License.