The main aim of this beta application is to provide innovative ways to interact with the computer.
Have you ever thought about doing away with the usual remote clicker?
The idea is to interact through a Google Home device, or the Google Assistant app installed on your phone, combined with simple hand gestures recognized by computer vision and deep learning algorithms.
Without touching the computer, the user can open the Keynote app with the presentation to show and, through simple voice commands, start the full-screen presentation.
Here is an example of a simple voice interaction:
User: 'Let's start'
End Device: 'Starting your presentation'
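The project's fulfillment code is not shown here, but the exchange above can be sketched as a small dispatcher that maps a recognized intent to the assistant's spoken reply. The intent names and replies below are assumptions for illustration, not the app's actual Dialogflow configuration:

```python
# Hypothetical voice-command dispatcher: intent names and replies
# are illustrative assumptions, not the app's real configuration.
INTENT_RESPONSES = {
    "start_presentation": "Starting your presentation",
    "close_presentation": "Closing your presentation",
}

def handle_intent(intent: str) -> str:
    """Return the assistant's spoken reply for a recognized intent."""
    return INTENT_RESPONSES.get(intent, "Sorry, I didn't get that")
```

For example, `handle_intent("start_presentation")` returns `"Starting your presentation"`, matching the dialogue above.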
To keep the app simple for testing, we currently support two kinds of gestures for switching slides: forward and backward.
Respectively:
- Number 4 (shown with the hand) - to go forward
- Number 5 (shown with the hand) - to go backward
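Once the classifier recognizes one of the two gestures, the app has to forward it to Keynote. The project's own code is not shown, but a minimal sketch of one common approach on macOS is to send arrow-key presses via `osascript`; the gesture labels and key codes below are assumptions for illustration:

```python
# Hypothetical sketch: map a recognized gesture label to a Keynote
# keystroke. Labels and the osascript call are illustrative assumptions.
import subprocess

# Assumed labels produced by the gesture classifier.
# macOS key codes: 124 = right arrow (next), 123 = left arrow (previous).
GESTURE_TO_KEY = {
    "four": "key code 124",  # number 4 gesture: next slide
    "five": "key code 123",  # number 5 gesture: previous slide
}

def gesture_to_keystroke(gesture: str):
    """Return the AppleScript key-press command for a gesture, or None."""
    return GESTURE_TO_KEY.get(gesture)

def send_to_keynote(gesture: str) -> None:
    """Send the keystroke to the frontmost app via osascript (macOS only)."""
    cmd = gesture_to_keystroke(gesture)
    if cmd is None:
        return  # unrecognized gesture: do nothing
    script = f'tell application "System Events" to {cmd}'
    subprocess.run(["osascript", "-e", script], check=True)
```

Any other gesture label is simply ignored, so spurious classifier output does not disturb the presentation.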
The limitation was that supporting other kinds of gestures, such as swipes or other numbers, would have required a much larger dataset to train the model.
We would also have needed a different hardware setup!
To end the presentation, simply talk to your Google Home or Google Assistant device and say something like: 'Close the presentation'
- Bruno Marafini - EIT Digital Master School - Cyber Security
- Mirko Schicchi - EIT Digital Master School - Cyber Security
- Filippo Tessaro - ICT University of Trento - Data Science