Tests on the robot
vvasco opened this issue · 2 comments
We want to test on the robot the new features introduced and validated in simulation, specifically the obstacle detection alone and then within the TUG demo.
Finally, I will run several tests of the whole demo to make sure everything works properly.
I updated the default parameters of `obstacleDetector` and `managerTUG` in order to:
- allow `obstacleDetector` to detect legs (only a few points are detected, thus I lowered the threshold on the minimum number of points required for a cluster to be valid); this might generate more false positives, thus `managerTUG` now discards obstacles if they do not occur with a certain frequency (see the sketch below);
- reduce the radius for stopping navigation when an obstacle is encountered (from 1.5 m to 1 m), since walls behind the robot easily fall inside this radius.
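To make the filtering behaviour concrete, here is a rough, hypothetical sketch of the two checks described above. The class and parameter names (`ObstacleGate`, `window`, `min_detections`, `stop_radius`) are placeholders and this is not the actual `managerTUG` implementation.

```cpp
// Hypothetical sketch (not the actual managerTUG code): confirm an obstacle
// only if it is reported frequently enough, and stop only if it is close.
#include <cmath>
#include <deque>

struct Obstacle { double x, y; };            // obstacle position in the robot frame [m]

class ObstacleGate {
    std::deque<double> detection_times;      // timestamps of recent detections [s]
    double window{1.0};                      // sliding window length [s] (assumed value)
    int min_detections{5};                   // detections required inside the window (assumed value)
    double stop_radius{1.0};                 // navigation stops within this radius [m]

public:
    // Register a detection and decide whether navigation should stop.
    bool shouldStop(const Obstacle &obs, double t_now) {
        detection_times.push_back(t_now);
        while (!detection_times.empty() && t_now - detection_times.front() > window)
            detection_times.pop_front();     // drop detections older than the window

        // Sporadic detections are treated as false positives and ignored.
        if (static_cast<int>(detection_times.size()) < min_detections)
            return false;

        // Stop only if the confirmed obstacle lies inside the stop radius.
        return std::hypot(obs.x, obs.y) < stop_radius;
    }
};

int main() {
    ObstacleGate gate;
    Obstacle legs{0.8, 0.1};                 // e.g. a pair of legs 0.8 m ahead
    bool stop = false;
    for (int i = 0; i < 6; i++)              // six detections within ~0.5 s
        stop = gate.shouldStop(legs, 0.1 * i);
    return stop ? 0 : 1;                     // stop becomes true once confirmed
}
```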
Obstacle detection out of the box
In this experiment, I ran the obstacle detection out of the box.
The detected obstacles in the middle viewer are the walls, the table legs and my legs.
When commanding the robot to reach 2.5 m along the x direction, it stops earlier because my legs fall within the 1 m radius. We can verify this with the `get_state` command, whose output tells us that the robot has reached 1.5 m.
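For reference, here is a minimal sketch of how such a state query can be issued from code over a YARP RPC port. The port names are placeholders; the actual one depends on how the navigation module is configured.

```cpp
// Minimal sketch of querying the navigation state over YARP RPC.
// The server port name "/navigation/rpc" is a placeholder, not the actual one.
#include <yarp/os/Network.h>
#include <yarp/os/RpcClient.h>
#include <yarp/os/Bottle.h>
#include <cstdio>

int main() {
    yarp::os::Network yarp;                      // initialize the YARP network
    yarp::os::RpcClient rpc;
    rpc.open("/get_state/client");               // local client port
    yarp::os::Network::connect("/get_state/client", "/navigation/rpc");

    yarp::os::Bottle cmd, reply;
    cmd.addString("get_state");                  // ask for the current robot state
    rpc.write(cmd, reply);                       // reply reports e.g. the reached x position
    std::printf("state: %s\n", reply.toString().c_str());

    rpc.close();
    return 0;
}
```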
Obstacle detection within the demo
In this experiment, I put an obstacle (a box) along the robot's path.
The robot stops and asks to remove the obstacle. When this is removed, the interaction starts again.
Here is a (rudimentary) video showing the functionality (I need to make a proper one).
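Below is a rough, hypothetical sketch of the stop / ask / resume behaviour described above; it is not the actual `managerTUG` state machine, just an illustration of the interaction flow.

```cpp
// Hypothetical sketch (not the actual managerTUG logic): stop navigation when a
// confirmed obstacle appears, ask to remove it, and resume once the path is clear.
#include <cstdio>

enum class DemoState { Navigating, WaitingForClearance };

class ObstacleHandler {
    DemoState state{DemoState::Navigating};

public:
    // Called at every control cycle with the current obstacle flag.
    void update(bool obstacle_confirmed) {
        switch (state) {
        case DemoState::Navigating:
            if (obstacle_confirmed) {
                std::puts("Stopping: please remove the obstacle.");  // robot speaks
                state = DemoState::WaitingForClearance;
            }
            break;
        case DemoState::WaitingForClearance:
            if (!obstacle_confirmed) {
                std::puts("Obstacle removed: resuming the interaction.");
                state = DemoState::Navigating;
            }
            break;
        }
    }
};

int main() {
    ObstacleHandler handler;
    handler.update(true);    // box placed along the path -> robot stops and asks
    handler.update(true);    // still there -> keeps waiting
    handler.update(false);   // box removed -> interaction resumes
    return 0;
}
```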