Hand Physics Toolkit (HPTK) is a toolkit for building physical, hand-driven interactions in a modular and scalable way. It is platform-independent and input-independent, and it can be combined with MRTK-Quest for UI interactions.
- Data model for accessing hand parts, components, and calculated values with very little code.
- Code architecture based on MVC-like modules (Wiki), with support for custom modules (Wiki).
- State-of-the-art hand physics, configurable in detail through configuration assets.
- Platform-independent. Tested in VR, AR, and non-XR applications.
- Input-independent. Use hand tracking or controllers.
- Scale-independent. Valid for any hand size.
- Configurable strategies to deal with loss of tracking.
- Physics-based touch/grab detection.
- Tracking noise smoothing.
- You can clone a ready-to-go project from HPTK-Sample.
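To illustrate the data-model bullet above, here is a pseudocode-style C# sketch of what querying a hand part might look like from a Unity component. All type, property, and field names below (`HandModel`, `parts`, `indexTip`, `isGrabbing`) are hypothetical placeholders, not HPTK's actual API — see the Wiki for the real data model:

```csharp
using UnityEngine;

// Hypothetical sketch: the names below are illustrative placeholders,
// not HPTK's actual API.
public class GrabLogger : MonoBehaviour
{
    public HandModel hand; // assumed entry point into the hand data model

    void Update()
    {
        // Read a calculated value exposed by the data model...
        if (hand.isGrabbing)
        {
            // ...and access a specific hand part, with very little code.
            Debug.Log($"Grabbing near {hand.parts.indexTip.transform.position}");
        }
    }
}
```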
- Supported Unity versions: 2019.4.4f1 LTS, 2019.3.15f1
- Supported devices:
  - Oculus Quest 1/2 (Android)
  - HoloLens 2 (UWP)
  - UnityXR-compatible controllers with:
    - Index trigger
    - Grip trigger
    - Primary 2D axis
- Supported render pipelines:
  - Universal Render Pipeline (URP)
  - Standard RP
1. Obtain HPTK.
2. Import Oculus Integration.
3. Configure Build Settings (Oculus Quest).
4. Configure Project Settings (!).
5. Set up a scene with hand-tracking support (Oculus Quest).
6. Set up HPTK-specific components.
7. Set up platform-specific HPTK components (Oculus Quest).
8. Modify or create HPTK configuration assets (if needed).
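For step 1 ("Obtain HPTK"), a common way to pull a Unity package into a project is to clone the repository into `Assets/`, or to reference it from `Packages/manifest.json`. The fragment below is a hedged sketch only: the repository URL and the `com.jorgejgnz.hptk` package name are assumptions, and HPTK may instead be distributed as a `.unitypackage` — check the Wiki for the supported installation method.

```json
{
  "dependencies": {
    "com.jorgejgnz.hptk": "https://github.com/jorgejgnz/HPTK.git"
  }
}
```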
Check out the Wiki for a detailed step-by-step guide.
The Wiki also includes more details about:
- Modules overview.
- Getting started with HPTK.
- How to build new HPTK modules.
Jorge Juan González - HCI Researcher at I3A (University of Castilla-La Mancha)
Oxters Wyzgowski - GitHub - Twitter
Michael Stevenson - GitHub
Nasim, K., Kim, Y. J. Physics-based assistive grasping for robust object manipulation in virtual reality. Comput Anim Virtual Worlds. 2018; 29:e1820. https://doi.org/10.1002/cav.1820
Linn, Allison. Talking with your hands: How Microsoft researchers are moving beyond keyboard and mouse. The AI Blog, Microsoft. 2016. https://blogs.microsoft.com/