hsp-iit/project-ergocub

Reliable grasping / object manipulation in simulation

Closed this issue · 4 comments

This issue tracks the status of grasping in simulation. The goal is to perform bimanual manipulation without attaching the object to the hand through a fake link. The following issues hinder reliable grasping:

  • The thumb hinders proper contact with the palm. This is resolved by manually folding the thumb, changing the default yaw of the wrist to allow for better contact, and activating the grasp only when the thumb is above the top of the object (see the sketch after this list). Following @xEnVrE's suggestion, I also reduced the height of the object to prevent any contact with the thumb.
  • Even when the palm is in contact, there are at most two contact points: one at the origin of the wrist and one at the start of the finger joints. The only time I get more than one contact on the fingers is when they are curled and the fingertips touch the object. No contacts are detected along the finger shafts.
  • Initially I thought the friction coefficients were the issue, but adding individual friction coefficients for each component of the hand didn't change the outcome: the object still slipped through.
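
For reference, the grasp-trigger check mentioned in the first point can be sketched as a simple height comparison. This is only an illustrative sketch with hypothetical names (`thumbAboveObject`, the world-frame convention, and the vector layout are assumptions), not the actual module code:

```cpp
#include <yarp/sig/Vector.h>

// Hypothetical helper: trigger the grasp only once the thumb tip has cleared
// the top of the object, so that closing the hand does not push the object away.
// Positions are assumed to be expressed in the world frame as (x, y, z).
bool thumbAboveObject(const yarp::sig::Vector& thumbTipPosition,
                      const yarp::sig::Vector& objectCenter,
                      double objectHeight)
{
    const double objectTop = objectCenter[2] + 0.5 * objectHeight;
    return thumbTipPosition[2] > objectTop;
}
```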

Below is a video of a failed grasp attempt.

grasp_fail2.mp4

Below is a video of a somewhat successful attempt. To achieve this, I needed to manually put the wrist in torque mode, which, according to @xEnVrE, technically should not have made any difference. Of course, the friction coefficients were also set individually for the wrist and each finger.

grasp_success2.mp4
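
For completeness, switching a set of joints to torque mode can be done through a `remote_controlboard` client roughly as follows. This is a sketch under assumptions: the port name and the joint indices are placeholders and need to be checked against the ergoCub simulation configuration.

```cpp
#include <vector>
#include <yarp/os/Network.h>
#include <yarp/os/Property.h>
#include <yarp/dev/PolyDriver.h>
#include <yarp/dev/IControlMode.h>

int main()
{
    yarp::os::Network yarpNetwork;

    // Connect to the simulated arm control board (the remote port name is an assumption).
    yarp::os::Property options;
    options.put("device", "remote_controlboard");
    options.put("remote", "/ergocubSim/left_arm");
    options.put("local", "/grasp-test/left_arm");

    yarp::dev::PolyDriver driver(options);
    if (!driver.isValid())
        return 1;

    yarp::dev::IControlMode* controlMode = nullptr;
    driver.view(controlMode);

    // Indices of the finger joints within the control board are an assumption:
    // check the board configuration for the actual mapping.
    const std::vector<int> fingerJoints = {7, 8, 9, 10, 11};
    for (const int joint : fingerJoints)
        controlMode->setControlMode(joint, VOCAB_CM_TORQUE);

    return 0;
}
```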

I will post more videos with a reduced object size and more successful manipulation with @Woolfrey's module. Suffice it to say that when we change the pose of the payload, the object usually slips unless we make more tweaks, which I will explain in the posts following this one.

One suggestion by @xEnVrE was to add a force sensor in the palm and use it as a control input in @Woolfrey's module. The Gazebo side has been implemented, but its viability as a control input still needs to be tested.
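
A minimal sketch of the YARP side of this idea is below: it only reads the simulated palm sensor from a streaming port so the measurement could then be fed to the manipulation module. The port names, the wrench layout, and the force threshold are assumptions, not the actual setup:

```cpp
#include <yarp/os/Network.h>
#include <yarp/os/BufferedPort.h>
#include <yarp/sig/Vector.h>

int main()
{
    yarp::os::Network yarpNetwork;

    // Port names are assumptions: the simulated palm sensor is expected to stream
    // its measurement on a YARP port (e.g. via a gazebo-yarp-plugins sensor plugin).
    yarp::os::BufferedPort<yarp::sig::Vector> palmPort;
    palmPort.open("/grasp-test/palm_force:i");
    yarp::os::Network::connect("/ergocubSim/left_hand/palm_force:o", palmPort.getName());

    while (true)
    {
        yarp::sig::Vector* wrench = palmPort.read();  // blocking read
        if (wrench == nullptr)
            break;

        // Use the measured normal force as feedback for the grasp controller,
        // e.g. stop closing the fingers once a contact threshold is exceeded.
        const double normalForce = (*wrench)[2];
        if (normalForce > 5.0)  // threshold in newtons, an assumption
        {
            // notify the manipulation module here
        }
    }

    return 0;
}
```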

xEnVrE commented

To achieve this, I needed to manually put the wrist in torque mode, which, according to @xEnVrE, technically should not have made any difference.

I think there was a misunderstanding :) what I said is that the fingers cannot be controlled in torque mode, either on the real robot or in simulation.

As regards the wrist, I don't know :)

cc @vigisushrutha23

I think there was a misunderstanding :) what I said is that the fingers cannot be controlled in torque mode, either on the real robot or in simulation.

I made a mistake in the mention as well. I just set all the finger joints to torque mode for the video posted, not the wrist or any other part of the hand.

Here is a somewhat more successful grasp with an object of smaller height and the fingers curling around the object, giving better, though not yet reliable, contact.

grasp__curl_fingers-2023-07-17_09.28.03.mp4

With the linkattacher plugin I was able to run some successful simulation runs, but since I combined it with walking, there are issues when I run the screen recorder, which causes the Gazebo real-time factor to drop. This in turn causes the walking controller to fail in long simulations. I am uploading a short video and closing the issue.

Object_human4-2023-09-15_00.23.50.mp4
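
For reference, the linkattacher plugin is driven through an RPC port. A minimal sketch of triggering an attach from C++ is below; the port names, command verb, and arguments are assumptions and should be verified against the gazebo-yarp-plugins linkattacher documentation:

```cpp
#include <yarp/os/Network.h>
#include <yarp/os/RpcClient.h>
#include <yarp/os/Bottle.h>

int main()
{
    yarp::os::Network yarpNetwork;

    // Port names are assumptions: the plugin is expected to expose an RPC server port.
    yarp::os::RpcClient rpc;
    rpc.open("/grasp-test/linkattacher:rpc");
    yarp::os::Network::connect(rpc.getName(), "/linkattacher/rpc:i");

    // Command verb and arguments are assumptions: check the plugin's RPC interface
    // for the exact attach/detach commands and the expected model/link names.
    yarp::os::Bottle command, reply;
    command.addString("attachUnscoped");
    command.addString("box");          // object model name (placeholder)
    command.addString("box_link");     // object link name (placeholder)
    command.addString("ergocubSim");   // robot model name (placeholder)
    command.addString("l_hand_palm");  // robot link name (placeholder)
    rpc.write(command, reply);

    return 0;
}
```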