
Idea: WebXR hand meshes in augmented reality

Details to be investigated:

Babylon includes hand meshes in the package. They are right/left hand models based on the WebXR Hand Input spec found here: https://immersive-web.github.io/webxr-hand-input/

If you are able to manipulate the mesh using the 25 joints the WebXR Hand Input spec requires per hand, you can “emulate” hand tracking.
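
For reference, a quick sketch of the 25 joint names the spec defines per hand (the list-building helper below is hypothetical; the joint names themselves match the spec):

```ts
// The WebXR Hand Input spec defines 25 joints per hand: the wrist, 4 thumb
// joints, and 5 joints for each of the other four fingers.
const FINGERS = ["index-finger", "middle-finger", "ring-finger", "pinky-finger"];

const WEBXR_HAND_JOINTS: string[] = [
  "wrist",
  "thumb-metacarpal",
  "thumb-phalanx-proximal",
  "thumb-phalanx-distal",
  "thumb-tip",
  ...FINGERS.flatMap((f) => [
    `${f}-metacarpal`,
    `${f}-phalanx-proximal`,
    `${f}-phalanx-intermediate`,
    `${f}-phalanx-distal`,
    `${f}-tip`,
  ]),
];

console.log(WEBXR_HAND_JOINTS.length); // 25
```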

The code that adjusts the hand meshes using this joint data is here: https://github.com/BabylonJS/Babylon.js/blob/master/packages/dev/core/src/XR/features/WebXRHandTracking.ts
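
Enabling that feature is documented Babylon.js API; a minimal sketch, assuming an existing `scene`:

```ts
import { Scene, WebXRFeatureName } from "@babylonjs/core";

declare const scene: Scene; // assumed: an existing Babylon scene

// Start an immersive-ar XR experience and turn on hand tracking.
const xr = await scene.createDefaultXRExperienceAsync({
  uiOptions: { sessionMode: "immersive-ar" },
});

xr.baseExperience.featuresManager.enableFeature(
  WebXRFeatureName.HAND_TRACKING,
  "latest",
  { xrInput: xr.input } // loads and updates the default hand meshes
);
```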

There is hand-pose estimation in the app, based on the technology described here: https://blog.tensorflow.org/2021/11/3D-handpose.html. It can be tested in the app here: https://chat.positive-intentions.com/#/hands
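
A minimal sketch of that detection flow using `@tensorflow-models/hand-pose-detection` (the package behind the linked blog post); the `video` element is assumed to already be streaming the camera:

```ts
import * as handPoseDetection from "@tensorflow-models/hand-pose-detection";
import "@tensorflow/tfjs-backend-webgl";

// Create the MediaPipe Hands detector with the TF.js runtime.
const detector = await handPoseDetection.createDetector(
  handPoseDetection.SupportedModels.MediaPipeHands,
  { runtime: "tfjs", modelType: "full" }
);

// Assumed: a <video> element already playing the camera stream.
const video = document.querySelector("video")!;
const hands = await detector.estimateHands(video);

for (const hand of hands) {
  // keypoints are 2D image-space points; keypoints3D are metric-scale 3D points.
  console.log(hand.handedness, hand.keypoints3D);
}
```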

The result of the model's estimation is a 3D point cloud of the detected hand pose (21 keypoints per hand).
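
A rough way to sanity-check that point cloud inside a Babylon scene is to drop a small sphere at each keypoint. This helper is hypothetical, and the axis flips are an assumption about how MediaPipe's coordinates map onto Babylon's:

```ts
import { MeshBuilder, Scene, Vector3 } from "@babylonjs/core";

// Hypothetical helper: render one hand's 21 keypoints3D as tiny spheres.
function renderHandPointCloud(
  scene: Scene,
  keypoints3D: { x: number; y: number; z: number }[]
) {
  return keypoints3D.map((kp, i) => {
    const sphere = MeshBuilder.CreateSphere(`kp-${i}`, { diameter: 0.01 }, scene);
    // Assumption: MediaPipe's y axis points down and z points away from the
    // camera, so both are flipped to match Babylon's coordinate system.
    sphere.position = new Vector3(kp.x, -kp.y, -kp.z);
    return sphere;
  });
}
```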

In theory, it is possible to map the hand-pose estimation into the Babylon.js augmented-reality scene, as seen in the app here: https://chat.positive-intentions.com/#/verse
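
The gap to bridge: MediaPipe produces 21 landmarks, while WebXR expects 25 joints. A hypothetical mapping sketch, approximating the four missing finger metacarpals by interpolating between the wrist and each finger's MCP landmark (real WebXR joints also carry orientation, which this skips):

```ts
import { Vector3 } from "@babylonjs/core";

type Point = { x: number; y: number; z: number };

// Hypothetical sketch: map MediaPipe's 21 landmarks (indexed 0-20 per the
// documented layout) onto the 25 WebXR joint names. Positions only; a full
// emulation would also need to derive an orientation for each joint.
function toWebXRJoints(kp: Point[]): Map<string, Vector3> {
  const v = (p: Point) => new Vector3(p.x, -p.y, -p.z); // assumed axis flip

  const joints = new Map<string, Vector3>([
    ["wrist", v(kp[0])],
    ["thumb-metacarpal", v(kp[1])], // thumb CMC
    ["thumb-phalanx-proximal", v(kp[2])], // thumb MCP
    ["thumb-phalanx-distal", v(kp[3])], // thumb IP
    ["thumb-tip", v(kp[4])],
  ]);

  // Each finger's MediaPipe landmarks start at its MCP index.
  const fingers: [string, number][] = [
    ["index-finger", 5],
    ["middle-finger", 9],
    ["ring-finger", 13],
    ["pinky-finger", 17],
  ];
  for (const [name, mcp] of fingers) {
    // No MediaPipe landmark for the metacarpal base: approximate it halfway
    // between the wrist and the MCP.
    joints.set(`${name}-metacarpal`, Vector3.Lerp(v(kp[0]), v(kp[mcp]), 0.5));
    joints.set(`${name}-phalanx-proximal`, v(kp[mcp]));
    joints.set(`${name}-phalanx-intermediate`, v(kp[mcp + 1]));
    joints.set(`${name}-phalanx-distal`, v(kp[mcp + 2]));
    joints.set(`${name}-tip`, v(kp[mcp + 3]));
  }
  return joints; // 25 entries keyed by WebXR joint name
}
```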