This repository contains a basic application that can be used to turn any planar surface into a virtual touch screen. The project detects touch and hover actions performed by the user on the projected screen and converts their coordinates to your machine's screen coordinates.
- Microsoft Kinect Sensor
- USB 3.0 compatible machine
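
If you want to verify that the sensor is actually detected before launching the app, a minimal console check is sketched below. It assumes the Kinect for Windows SDK 2.0 (`Microsoft.Kinect`); the small `SensorCheck` program itself is illustrative and not part of this repository.

```csharp
using System;
using System.Threading;
using Microsoft.Kinect; // Kinect for Windows SDK 2.0

class SensorCheck
{
    static void Main()
    {
        // GetDefault() returns the Kinect v2 sensor known to the runtime.
        KinectSensor sensor = KinectSensor.GetDefault();
        sensor.Open();

        // IsAvailable turns true a moment after the driver connects,
        // so wait briefly before reading it.
        Thread.Sleep(2000);
        Console.WriteLine(sensor.IsAvailable
            ? "Kinect sensor detected."
            : "No sensor found - check the USB 3.0 connection.");

        sensor.Close();
    }
}
```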
- Setting up the sensor
- Calibration
- User detection test
- Post-tracking process
- Hand position treated as Hover
- Hand position treated as Touch (see the classification sketch after this list)
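
The hover/touch distinction essentially comes down to how close the tracked hand is to the projected surface. Below is a minimal sketch of one way such a classification can be done with Kinect SDK 2.0 body data; the plane representation, the threshold values, and the names (`HandAction`, `HandClassifier.Classify`) are illustrative assumptions, not this repository's actual implementation.

```csharp
using System;
using Microsoft.Kinect; // Kinect for Windows SDK 2.0

enum HandAction { None, Hover, Touch }

static class HandClassifier
{
    // Illustrative thresholds in metres; the real values depend on calibration.
    const float TouchDistance = 0.03f;
    const float HoverDistance = 0.15f;

    // Classify a tracked hand joint by its perpendicular distance to the
    // projected surface plane, given as a unit normal n and offset d
    // (points p on the surface satisfy n dot p + d = 0).
    public static HandAction Classify(Joint hand, CameraSpacePoint normal, float d)
    {
        if (hand.TrackingState != TrackingState.Tracked)
            return HandAction.None;

        CameraSpacePoint p = hand.Position;
        float distance = Math.Abs(normal.X * p.X + normal.Y * p.Y + normal.Z * p.Z + d);

        if (distance < TouchDistance) return HandAction.Touch;
        if (distance < HoverDistance) return HandAction.Hover;
        return HandAction.None;
    }
}
```

A caller would typically pass `body.Joints[JointType.HandRight]` from a `BodyFrameReader` callback and then map the resulting touch point to machine screen coordinates, as in the calibration sketch further below.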
- Connect the sensor to the machine.
- Connect the machine to a projector and project onto a planar surface.
- Start the WPF application.
- Calibrate the screen by selecting the top-left and bottom-right corners of the projected screen (the corner-to-screen mapping is sketched after this list).
- Stand in front of the sensor and let it detect your body; a notification is displayed once detection succeeds.
- After body detection, you can use the touch and hover functionality.
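
Once the two corners are selected, a hand point inside the projected rectangle can be mapped linearly to machine screen pixels. The sketch below shows that mapping under the assumption of a simple axis-aligned rectangle; the method name, parameters, and clamping behaviour are illustrative and not taken from this repository.

```csharp
using System;
using System.Windows; // WPF Point

static class ScreenMapper
{
    // Map a point inside the calibrated projection rectangle (defined by the
    // top-left and bottom-right corners selected during calibration) to
    // machine screen pixels, assuming a simple axis-aligned linear mapping.
    public static Point ToScreen(Point sensorPoint, Point topLeft, Point bottomRight,
                                 double screenWidth, double screenHeight)
    {
        double nx = (sensorPoint.X - topLeft.X) / (bottomRight.X - topLeft.X);
        double ny = (sensorPoint.Y - topLeft.Y) / (bottomRight.Y - topLeft.Y);

        // Clamp so hand points slightly outside the projection still land on screen.
        nx = Math.Max(0.0, Math.Min(1.0, nx));
        ny = Math.Max(0.0, Math.Min(1.0, ny));

        return new Point(nx * screenWidth, ny * screenHeight);
    }
}
```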
To fix a bug or enhance an existing module, follow these steps:
- Fork the repo
- Create a new branch (git checkout -b improve-feature)
- Make the appropriate changes in the files
- Stage the changed files (git add)
- Commit your changes (git commit -am 'Improve feature')
- Push to the branch (git push origin improve-feature)
- Create a Pull Request