Details to be investigated:
Babylon.js ships right/left hand meshes in the package, based on the WebXR Hand Input spec found here: https://immersive-web.github.io/webxr-hand-input/
If you can manipulate the mesh using the 25 joints required for WebXR hand tracking, you can "emulate" it.
The code that adjusts the hand meshes using these joint poses is here: https://github.com/BabylonJS/Babylon.js/blob/master/packages/dev/core/src/XR/features/WebXRHandTracking.ts
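For reference, these are the 25 joint names the spec defines (the `XRHandJoint` values that any emulation would need to cover):

```typescript
// The 25 joints defined by the WebXR Hand Input spec, in spec order:
// the wrist, 4 thumb joints, and 5 joints for each of the 4 fingers.
const XR_HAND_JOINTS = [
  "wrist",
  "thumb-metacarpal", "thumb-phalanx-proximal", "thumb-phalanx-distal", "thumb-tip",
  "index-finger-metacarpal", "index-finger-phalanx-proximal",
  "index-finger-phalanx-intermediate", "index-finger-phalanx-distal", "index-finger-tip",
  "middle-finger-metacarpal", "middle-finger-phalanx-proximal",
  "middle-finger-phalanx-intermediate", "middle-finger-phalanx-distal", "middle-finger-tip",
  "ring-finger-metacarpal", "ring-finger-phalanx-proximal",
  "ring-finger-phalanx-intermediate", "ring-finger-phalanx-distal", "ring-finger-tip",
  "pinky-finger-metacarpal", "pinky-finger-phalanx-proximal",
  "pinky-finger-phalanx-intermediate", "pinky-finger-phalanx-distal", "pinky-finger-tip",
] as const; // 25 entries
```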
There is hand-pose estimation in the app, based on the technology described here: https://blog.tensorflow.org/2021/11/3D-handpose.html. It can be tested in the app here: https://chat.positive-intentions.com/#/hands
The result of the model's estimation is a 3D point cloud of the detected hand pose.
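A minimal sketch of reading that point cloud, assuming the `@tensorflow-models/hand-pose-detection` package (the library released alongside that blog post); the `<video>` element and model options here are placeholders, not the app's actual configuration:

```typescript
import * as handPoseDetection from "@tensorflow-models/hand-pose-detection";
import "@tensorflow/tfjs-backend-webgl";

// Create a MediaPipeHands detector with the tfjs runtime and read the
// 3D keypoints for the first detected hand.
async function readHandPointCloud(video: HTMLVideoElement) {
  const detector = await handPoseDetection.createDetector(
    handPoseDetection.SupportedModels.MediaPipeHands,
    { runtime: "tfjs", modelType: "full" }, // assumed options
  );
  const hands = await detector.estimateHands(video);
  if (hands.length === 0) return null;
  // keypoints3D: 21 points (wrist + 4 joints per digit) with x/y/z
  // expressed in meters, relative to an origin near the hand.
  return hands[0].keypoints3D ?? null;
}
```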
In theory, it is possible to map the hand-pose estimation into the Babylon.js augmented reality seen in the app here: https://chat.positive-intentions.com/#/verse
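The mapping is essentially a retargeting problem: the model returns 21 keypoints while WebXR expects 25 joints (it adds a metacarpal per finger), so the missing metacarpals have to be approximated, e.g. by interpolating between the wrist and each knuckle. A rough sketch of that idea in Babylon.js; the index constants follow the MediaPipe model card, while the 0.35 interpolation factor is an assumption to be tuned visually:

```typescript
import { Vector3 } from "@babylonjs/core";

// MediaPipe keypoint indices: wrist(0), thumb cmc/mcp/ip/tip (1-4),
// then mcp/pip/dip/tip for index (5-8), middle (9-12), ring (13-16),
// pinky (17-20).
const WRIST = 0;
const FINGER_MCP = { index: 5, middle: 9, ring: 13, pinky: 17 };

type Keypoint3D = { x: number; y: number; z: number };

// Convert the 21-point cloud into the 25 named joint positions that
// WebXR hand tracking expects.
function toXrJointPositions(kp: Keypoint3D[]): Map<string, Vector3> {
  const v = (p: Keypoint3D) => new Vector3(p.x, p.y, p.z);
  const joints = new Map<string, Vector3>();
  joints.set("wrist", v(kp[WRIST]));
  // Thumb: MediaPipe's cmc/mcp/ip/tip line up with the spec's
  // metacarpal/phalanx-proximal/phalanx-distal/tip.
  ["thumb-metacarpal", "thumb-phalanx-proximal", "thumb-phalanx-distal", "thumb-tip"]
    .forEach((name, i) => joints.set(name, v(kp[1 + i])));
  for (const [finger, mcp] of Object.entries(FINGER_MCP)) {
    // The finger metacarpals have no MediaPipe counterpart, so lerp
    // from the wrist toward the knuckle (factor is a guess).
    joints.set(`${finger}-finger-metacarpal`,
      Vector3.Lerp(v(kp[WRIST]), v(kp[mcp]), 0.35));
    ["phalanx-proximal", "phalanx-intermediate", "phalanx-distal", "tip"]
      .forEach((name, i) => joints.set(`${finger}-finger-${name}`, v(kp[mcp + i])));
  }
  return joints;
}
```

Positions alone are not enough to drive the hand meshes: `WebXRHandTracking` consumes full joint poses, so per-joint orientations would still need to be derived, e.g. from the direction to the next joint along each finger.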