A video of the project: https://www.youtube.com/watch?v=KIMvJKLBZRc&feature=youtu.be
Chaoran Jin chaoran@student.unimelb.edu.au
Eldar Kurmakaev ekurmakaev@student.unimelb.edu.au
Tianqi Zhou tiazhou@student.unimelb.edu.au
| File | Location |
|---|---|
| Interface | `/interface/Main.py` |
| Arduino code | `/sketch_may06a/sketch_may06a.ino` |
The interface has three windows:
- `form` is `mainWindow`, the welcome screen and sample-hits page
- `dashboard` is `colorDashboard`, where users decide the mapping between colors and emotions
- `camera` is `cameraWindow`, which displays the camera capture and drives the light projection at the backend
Remember to fix the details in `recognition/noStreamCapture.py` for the hardware connection:
- Change the URL in `blink()` and `final_output()` when the network environment changes.
- Check whether the `requests` lines are commented out; they are sometimes commented to save testing time.
- Make sure the default `model.txt` and `saturate.txt` exist in the root lib before running. If they are missing, comment out those two methods in the `VideoGet` class and run once to generate them.
- Change the API key in `processInAzure()` to your own if the existing one is down.
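The setup checks above could be gathered into a small startup helper. This is only a minimal sketch, not the project's actual code: the names `SERVER_URL`, `AZURE_API_KEY`, `azure_headers`, and `missing_default_files` are hypothetical, while `model.txt` and `saturate.txt` come from the notes above.

```python
import os

# Hypothetical configuration constants -- replace with your own values.
SERVER_URL = "http://192.168.0.10:5000"    # the URL used by blink() and final_output()
AZURE_API_KEY = "your-azure-face-api-key"  # the key used by processInAzure()


def azure_headers(key):
    """Build the request headers Azure Cognitive Services expects
    when an image is POSTed as raw bytes."""
    return {
        "Ocp-Apim-Subscription-Key": key,
        "Content-Type": "application/octet-stream",
    }


def missing_default_files(root="."):
    """Return which of the default data files still need to be
    generated before the recognition pipeline can run."""
    required = ("model.txt", "saturate.txt")
    return [f for f in required if not os.path.exists(os.path.join(root, f))]
```

Running `missing_default_files()` at startup makes it obvious when the generation step in `VideoGet` still needs to be run, instead of failing later with a file-not-found error.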