yzqin/dex-hand-teleop

hand movement during teleop

suryaprabhakar414 opened this issue · 4 comments

The code runs smoothly, but when I run teleop_collect_data.py, the hand movement and grasping in SAPIEN are not as responsive as shown in the demonstration. I tried plotting a visualization as well.

I also plotted the estimated x, y, z position of the hand with respect to time, for three signals:

XYZ_vs_time

blue -> current pose, obtained from self.robot.get_qpos()
green -> estimated robot_qpos given as input to mano_robot.control_robot()
orange -> next estimated robot_qpos after applying the LPFilter

It can be seen that initially, while the hand is stable, the blue (current pose), green (estimated position), and orange (next estimated robot_qpos) curves almost overlap, but when I move my hand, it takes at least 10-15 frames for the blue curve to catch up with the orange and green curves.
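For context, a first-order low-pass filter alone can produce exactly this kind of multi-frame lag. The sketch below is a minimal illustration, assuming LPFilter is a simple exponential smoother (an assumption for illustration; the actual LPFilter in the repository may be implemented differently). With a smoothing factor alpha = 0.2, a step change in the target takes 14 frames to reach 95% of its final value, which matches the observed 10-15 frame delay.

```python
class LPFilter:
    """Hypothetical first-order exponential low-pass filter (illustration only)."""

    def __init__(self, alpha):
        self.alpha = alpha  # smoothing factor in (0, 1]; smaller -> smoother but laggier
        self.y = None       # filtered state

    def next(self, x):
        # First sample initializes the state; afterwards blend new input with state.
        if self.y is None:
            self.y = x
        else:
            self.y = self.alpha * x + (1 - self.alpha) * self.y
        return self.y


# Step input: the target jumps from 0 to 1.
# Count how many frames the filtered value needs to reach 95% of the target.
f = LPFilter(alpha=0.2)
y = f.next(0.0)
frames = 0
while y < 0.95:
    y = f.next(1.0)
    frames += 1

print(frames)  # number of frames to reach 95% of the step
```

Raising alpha (or the filter's cutoff frequency) reduces this settling time at the cost of passing through more tracking jitter.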

To ensure stability, I am also dividing the translation component of the estimated robot qpos by 30, because without that the hand in SAPIEN keeps moving even when my real hand is still.

Can you please help me with this?

The root of the problem is most likely the computational capacity of your machine. The pipeline is demanding in detection, simulation, and rendering, and if the GPU/CPU is not fast enough, it cannot keep up in real time. That shortfall shows up as laggy or unsmooth control from the user's point of view. Could you share your detailed hardware specifications so we can better assess the situation?

Also, could you provide a video for the above-mentioned issue? Thanks!

Sure, here are the details of my hardware:
NVIDIA RTX A1000 Laptop GPU 4GB
NVIDIA Driver Version: 535.154.05 CUDA Version: 12.2

The issue was resolved.