Quest3 Hand tracking issue
I ran the script teleop/teleop_hand.py from https://github.com/OpenTeleVision/TeleVision. It uses vuer for hand tracking; the device is a Meta Quest 3.
```python
async def on_hand_move(self, event, session, fps=60):
    try:
        self.left_hand_shared[:] = event.value["leftHand"]
        self.right_hand_shared[:] = event.value["rightHand"]
        self.left_landmarks_shared[:] = np.array(event.value["leftLandmarks"]).flatten()
        self.right_landmarks_shared[:] = np.array(event.value["rightLandmarks"]).flatten()
    except:
        pass
```
When the right hand is not being tracked, the value of event.value["rightHand"] becomes unstable.
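For what it's worth, one way to avoid writing unstable values is to copy the shared buffers only when the expected keys are actually present, instead of swallowing every error with a bare except. This is just a sketch, assuming the payload omits (or nulls out) a hand's entries while that hand is untracked:

```python
import numpy as np

async def on_hand_move(self, event, session, fps=60):
    value = event.value
    # Copy a hand only when its entries are present and usable; otherwise
    # keep the last valid pose already stored in the shared buffers.
    if value.get("leftHand") is not None and value.get("leftLandmarks") is not None:
        self.left_hand_shared[:] = value["leftHand"]
        self.left_landmarks_shared[:] = np.asarray(value["leftLandmarks"]).flatten()
    if value.get("rightHand") is not None and value.get("rightLandmarks") is not None:
        self.right_hand_shared[:] = value["rightHand"]
        self.right_landmarks_shared[:] = np.asarray(value["rightLandmarks"]).flatten()
```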
@csjiyw This seems to be a user-land issue. Before I get into the details, here are some updates on the Hand API:
- The Hand API has changed in the newest version of vuer (v0.0.33, see the Documentation).
- The new hand pose data returns the full transformation matrix of each landmark, giving you orientations in the wrist reference frame.
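If it helps, here is a rough sketch of consuming the new per-landmark transforms in Python. The flat layout, joint count, and function name are assumptions on my side, so check the v0.0.33 documentation for the exact schema:

```python
import numpy as np

def unpack_hand_landmarks(flat, num_joints=25):
    """Split a flat array of per-joint 4x4 transforms into rotations and
    positions expressed in the wrist reference frame.

    The joint count and the row-major layout (translation in the last
    column) are assumptions here; verify them against the docs.
    """
    mats = np.asarray(flat, dtype=np.float64).reshape(num_joints, 4, 4)
    rotations = mats[:, :3, :3]  # 3x3 orientation of each landmark
    positions = mats[:, :3, 3]   # xyz position of each landmark
    return rotations, positions
```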
Explanations of behavior
When the landmarks associated with a hand are not visible, the headset has a built-in mechanism for inferring where they are. We use the WRIST joint's isEmulated attribute to infer whether the hand is out of view. If it is, I remove the hand pose from the message data and hide the hand visual component from view. If other joints are emulated, I do not remove them; you can see Quest's hand pose estimation at play when a finger is occluded.
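On the consumer side, that means a hand missing from the payload signals "out of view", while a hand that is present may still contain emulated finger joints. A small helper along these lines (the key name and fallback policy are illustrative, not vuer's API) makes the distinction explicit:

```python
def select_hand_pose(value, last_pose, key="rightHand"):
    """Return (pose, visible) for one hand from an incoming payload.

    Per the explanation above, the hand is dropped from the message when
    its wrist joint is emulated (hand out of view); in that case we fall
    back to the last valid pose instead of writing garbage downstream.
    """
    pose = value.get(key)
    if pose is None:
        return last_pose, False  # hand hidden: hold the previous pose
    return pose, True            # hand visible: use the fresh pose
```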
btw, @chengxuxin did you run into these issues?
We have since replaced the third-party implementation with our own, which is more reliable. I recommend the newest release.