ThirdEye

Smart Glasses for the Visually Impaired


Built at PennApps, the smart glasses are equipped with ultrasonic sensors, a camera, buzzers, and a Raspberry Pi Zero, and they analyze the user's surroundings. The glasses detect whom the user is talking to and which objects are around them through image recognition, combining custom image filtering (to isolate the most important subjects in a photo) with Google Vision and IBM Watson. Three ultrasonic sensors run simultaneously on separate threads so that distance data can be gathered and processed in real time. The user speaks voice commands through the mobile application to trigger responses from the smart glasses.
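
The sketch below illustrates the threading setup only; it is a minimal sketch, not the project's actual code. SensorReading and readDistanceCm() are hypothetical placeholders for the real HC-SR04 GPIO timing logic on the Raspberry Pi Zero, and the 50 cm buzzer threshold is an assumed example value.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Minimal sketch: three ultrasonic sensors polled on separate threads,
// feeding a shared queue that a consumer drains in real time.
public class UltrasonicMonitor {

    // Hypothetical reading type; the real code would carry whatever fields it needs.
    record SensorReading(int sensorId, double distanceCm, long timestampMs) {}

    private final BlockingQueue<SensorReading> readings = new LinkedBlockingQueue<>();

    // Placeholder for the actual HC-SR04 trigger/echo measurement via GPIO.
    private double readDistanceCm(int sensorId) {
        return 100.0; // assumption: real code would time the echo pulse
    }

    public void start() {
        for (int id = 0; id < 3; id++) {
            final int sensorId = id;
            Thread reader = new Thread(() -> {
                while (!Thread.currentThread().isInterrupted()) {
                    double d = readDistanceCm(sensorId);
                    readings.offer(new SensorReading(sensorId, d, System.currentTimeMillis()));
                    try {
                        Thread.sleep(50); // poll roughly 20 times per second
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                }
            }, "ultrasonic-" + sensorId);
            reader.setDaemon(true);
            reader.start();
        }
    }

    public static void main(String[] args) throws InterruptedException {
        UltrasonicMonitor monitor = new UltrasonicMonitor();
        monitor.start();
        // Consumer: react when any sensor sees an obstacle closer than 50 cm.
        while (true) {
            SensorReading r = monitor.readings.take();
            if (r.distanceCm() < 50.0) {
                System.out.println("Obstacle near sensor " + r.sensorId());
            }
        }
    }
}
```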

Feature List

Social:
- By saying "Who is that?", the app performs facial recognition on the person the user is facing (see the Vision API sketch after this list).
- We have also paired this facial recognition with the user's Facebook friends to find a match.

Financial:
- The user can verify transactions through the Nessie API. If there is an issue, the Capital One service line is called.
- By saying "Count my money", the glasses count how much money the person is looking at.
- Using Nessie, we are able to take a picture of a check and deposit the amount in the user's bank account (see the Nessie sketch after this list).

General:
- By saying "What is that?", the app reports its best guess as to what the person is looking at (handled by the same Vision API sketch below).
- Designed for urban living, the app alerts the user when nearing a stopped pedestrian crossing.
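
For the "Who is that?" and "What is that?" commands, a request like the following could be sent to the Google Cloud Vision images:annotate endpoint. This is a minimal sketch rather than the project's actual pipeline: it assumes the API key is supplied via a VISION_API_KEY environment variable, that the captured frame is saved as frame.jpg, and it prints the raw JSON response instead of parsing it.

```java
import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Base64;

// Minimal sketch of a Google Cloud Vision "images:annotate" request asking for
// label detection ("What is that?") and face detection ("Who is that?").
public class VisionRequest {

    public static void main(String[] args) throws IOException {
        String apiKey = System.getenv("VISION_API_KEY"); // assumption: key comes from an env var
        byte[] image = Files.readAllBytes(Path.of("frame.jpg")); // photo captured by the glasses
        String encoded = Base64.getEncoder().encodeToString(image);

        String body = "{\"requests\":[{"
                + "\"image\":{\"content\":\"" + encoded + "\"},"
                + "\"features\":["
                + "{\"type\":\"LABEL_DETECTION\",\"maxResults\":5},"
                + "{\"type\":\"FACE_DETECTION\",\"maxResults\":3}]"
                + "}]}";

        URL url = new URL("https://vision.googleapis.com/v1/images:annotate?key=" + apiKey);
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body.getBytes(StandardCharsets.UTF_8));
        }

        // Print the raw JSON response; the real app would parse the labels/faces
        // and speak the best guess back to the user.
        System.out.println(new String(conn.getInputStream().readAllBytes(), StandardCharsets.UTF_8));
    }
}
```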
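
The check-deposit flow could look roughly like the sketch below. The base URL, the "?key=" authentication, and the deposit JSON fields are assumptions based on the Capital One Nessie hackathon API, and the amount is assumed to have already been read from the check photo by an earlier OCR step.

```java
import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

// Rough sketch of depositing a check amount into a Nessie account.
// Endpoint path and fields are assumptions about the Nessie API.
public class NessieDeposit {

    public static void depositCheck(String accountId, String apiKey, double amount) throws IOException {
        String body = "{\"medium\":\"balance\","
                + "\"amount\":" + amount + ","
                + "\"description\":\"Check deposited via ThirdEye\"}";

        URL url = new URL("http://api.nessieisreal.com/accounts/" + accountId
                + "/deposits?key=" + apiKey);
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body.getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("Nessie responded with HTTP " + conn.getResponseCode());
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical placeholders; a real account ID and API key would be used here.
        depositCheck("ACCOUNT_ID", System.getenv("NESSIE_API_KEY"), 125.50);
    }
}
```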