The project is a system designed to teach and correct the movements of blind students learning a gesture. Using an OpenCV-based algorithm and a wearable device, we give these students a better perception of whether they are performing the gesture correctly.
Movement corrections are delivered to the user as real-time haptic feedback while he/she executes the gesture.
This work proposes human gesture evaluation through visual detection, with haptic feedback as an additional tool. The idea is to monitor the gesture visually using fiducial markers, follow the executed trajectory, and then send haptic feedback to the user.
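For context, the snippet below is a minimal, self-contained sketch of how a marker center can be obtained with the standalone ArUco 3 library. The project's Vision class presumably wraps something similar, but the exact calls and the include path here are assumptions, not the project's code.

    #include <aruco/aruco.h>
    #include <opencv2/opencv.hpp>

    int main() {
        cv::VideoCapture cap(0);                 // default camera
        aruco::MarkerDetector detector;
        cv::Mat frame;
        while (cap.read(frame)) {
            // Detect every ArUco marker visible in the frame
            std::vector<aruco::Marker> markers = detector.detect(frame);
            for (aruco::Marker& m : markers) {
                cv::Point2f center = m.getCenter();   // marker center in pixels
                cv::circle(frame, center, 5, cv::Scalar(0, 255, 0), -1);
            }
            cv::imshow("markers", frame);
            if (cv::waitKey(1) == 27) break;     // Esc quits
        }
        return 0;
    }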
The project later evolved in a different direction: to filter the recorded gesture, we use a machine learning algorithm. A SOM (Self-Organizing Map) neural network was tested first, but a network called GWR turned out to fit the problem better.
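For context, the core of a GWR (Grow When Required) network is its insertion test: when the best-matching node is both well-trained (habituated) and still a poor match for the input, a new node is grown between them. The sketch below illustrates that test on 2-D gesture points; it is a simplified illustration under assumed parameter values, omitting the edge, neighbor-adaptation, and edge-aging steps of the full algorithm, and is not the project's implementation.

    #include <cmath>
    #include <limits>
    #include <vector>

    struct Node {
        double w[2];    // weight vector (a 2-D point)
        double firing;  // habituation counter, starts at 1 and decays with use
    };

    struct GWR {
        std::vector<Node> nodes;
        double aT = 0.95;  // activity (insertion) threshold
        double hT = 0.30;  // firing threshold: below it, a node counts as trained
        double eb = 0.10;  // learning rate of the best-matching node

        void update(const double x[2]) {
            if (nodes.size() < 2) {              // bootstrap with the first inputs
                nodes.push_back({{x[0], x[1]}, 1.0});
                return;
            }
            // Find the best-matching node b
            std::size_t b = 0;
            double best = std::numeric_limits<double>::max();
            for (std::size_t i = 0; i < nodes.size(); ++i) {
                double dx = nodes[i].w[0] - x[0], dy = nodes[i].w[1] - x[1];
                double d = std::sqrt(dx * dx + dy * dy);
                if (d < best) { best = d; b = i; }
            }
            double activity = std::exp(-best);   // decays with distance to the input
            if (activity < aT && nodes[b].firing < hT) {
                // Poorly represented input and a trained winner: grow a new
                // node halfway between the input and the best-matching node.
                nodes.push_back({{0.5 * (nodes[b].w[0] + x[0]),
                                  0.5 * (nodes[b].w[1] + x[1])}, 1.0});
            } else {
                // Otherwise adapt the winner toward the input, modulated by
                // how habituated it already is, then habituate it further.
                nodes[b].w[0] += eb * nodes[b].firing * (x[0] - nodes[b].w[0]);
                nodes[b].w[1] += eb * nodes[b].firing * (x[1] - nodes[b].w[1]);
                nodes[b].firing *= 0.95;
            }
        }
    };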
- OpenCV 3.4.x Installation Guide
- ArUco 3.0.10 Installation Guide
This project was initially run on Linux (Ubuntu 16.04), with OpenCV 3.4.1 and ArUco 3.0.10. The machine had a 7th-generation Intel Core i5 and 8 GB of RAM.
- Regular motor activation
- Alternated motor activation
to do
to do
The main code is in the ../structure directory. To execute it, first choose the mode of execution by editing the main.cpp file and the project that contains the neural-network (GWR) algorithm.
It has four modes:
- Recording a new Gesture
- Running the data through the GWR
- Running the correction algorithm without the Wearable
- Running the correction algorithm with the Wearable
Open the file main.cpp and edit it to look like this:
    Vision vision(argc, argv);                                    // camera + marker detection
    Trajectory trajectory("../data/square.csv");                  // reference gesture
    trajectory.unnormalize(Point(FRAME_WIDTH/2, FRAME_HEIGHT/2)); // map it to frame coordinates
    trajectory.saveMovement("../data/new_movement.csv");          // output file for the new gesture
    vision.record("../../Videos/random_test.avi");                // output video of the session
    while (1) {
        vision.calculateTagCenter();
        if (vision.isTargetOn()) {
            // Marker detected: store its center as a point of the new gesture
            trajectory.savePoint(vision.getCenter());
        }
        vision.show();
        vision.saveVideo();    // write the current frame once per iteration
    }
    trajectory.endSaving();
to do
Open the file main.cpp and edit it to look like this:
    Vision vision(argc, argv);
    Trajectory trajectory("../data/square.csv");                  // gesture to follow
    trajectory.unnormalize(Point(FRAME_WIDTH/2, FRAME_HEIGHT/2));
    trajectory.saveMovement("../data/random_test.csv");           // log of the executed movement
    while (1) {
        vision.calculateTagCenter();
        // Draw the reference trajectory up to the point the user should reach next
        vision.drawTrajectory(trajectory, trajectory.getCurrentPointId());
        if (vision.isTargetOn()) {
            trajectory.setNextPoint0(vision.getCenter());         // advance the target point
            // Visualize the deviation between the marker and the reference point
            vision.drawError(vision.getCenter(), trajectory.getCurrentPoint());
            trajectory.savePoint(vision.getCenter());
        }
        vision.show();
    }
    trajectory.endSaving();
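The snippet above leaves the error computation to getCurrentPoint()/drawError(). As a purely illustrative sketch, the error can be read as the deviation between the tracked marker center and the current reference point; the helper names below are hypothetical and the project's getError() may compute something different.

    #include <cmath>
    #include <opencv2/core.hpp>

    // Hypothetical helper: direction the user should move in, as a vector
    // from the tracked marker center to the current reference point.
    cv::Point2f trajectoryError(const cv::Point2f& center,
                                const cv::Point2f& reference) {
        return reference - center;
    }

    // Hypothetical helper: magnitude of the deviation, e.g. to scale the
    // strength of the haptic feedback.
    float errorMagnitude(const cv::Point2f& center,
                         const cv::Point2f& reference) {
        cv::Point2f d = reference - center;
        return std::hypot(d.x, d.y);
    }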
Open the file main.cpp and edit it to look like this:
    Vision vision(argc, argv);
    Weareable weareable;
    Trajectory trajectory("../data/square.csv");
    trajectory.unnormalize(Point(FRAME_WIDTH/2, FRAME_HEIGHT/2));
    trajectory.saveMovement("../data/random_test.csv");
    weareable.setIP((char*)"10.6.4.107");                         // bracelet's network address
    weareable.start();
    while (1) {
        vision.calculateTagCenter();
        vision.drawTrajectory(trajectory, trajectory.getCurrentPointId());
        if (vision.isTargetOn()) {
            trajectory.setNextPoint0(vision.getCenter());
            vision.drawError(vision.getCenter(), trajectory.getCurrentPoint());
            // Send the current trajectory error to the bracelet as haptic feedback
            weareable.send(trajectory.getError(vision.getCenter()));
            trajectory.savePoint(vision.getCenter());
        }
        vision.show();
    }
    trajectory.endSaving();
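The Weareable class hides how the error reaches the bracelet. Since it is configured with an IP address, one plausible reading is that a small datagram is sent per processed frame; the sketch below shows that idea with POSIX UDP sockets, and the port number and message format are assumptions for illustration only.

    #include <arpa/inet.h>
    #include <cstdio>
    #include <sys/socket.h>
    #include <unistd.h>

    int main() {
        int sock = socket(AF_INET, SOCK_DGRAM, 0);   // UDP: low latency, no handshake
        sockaddr_in addr{};
        addr.sin_family = AF_INET;
        addr.sin_port = htons(5000);                 // hypothetical port
        inet_pton(AF_INET, "10.6.4.107", &addr.sin_addr);
        float error = 12.5f;                         // e.g. pixels of deviation
        char msg[32];
        int len = std::snprintf(msg, sizeof(msg), "%.2f", error);
        // One small datagram per frame keeps the feedback timely
        sendto(sock, msg, len, 0, (sockaddr*)&addr, sizeof(addr));
        close(sock);
        return 0;
    }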
Abstract: Learning a gesture pertains to learning an expression of human motion, involving the hands, arms, face, head, and/or body. In this work, we propose to employ haptic feedback as an additional tool in the gesture following/evaluation loop. Accordingly, the user wears a haptic wearable device in the form of a bracelet which vibrates according to the trajectory error. Our research hypothesis is to investigate whether such a haptic device aids the user in correcting his/her movement in relation to the prerecorded trajectory.