Add support for 'palmspace' #1310
I guess I have been using grip space as a fallback for this; it would be handy to have it for both hand and gamepad, to make picking up objects more reliable.
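The fallback pattern described above could be sketched as a small helper. Note that `palmSpace` is hypothetical here (it is precisely what this issue proposes); only `gripSpace` and `targetRaySpace` exist on `XRInputSource` today:

```javascript
// Pick the best XRSpace for attaching a held object.
// 'palmSpace' is hypothetical (the space proposed in this issue);
// today an XRInputSource only offers gripSpace and targetRaySpace.
function pickGrabSpace(inputSource) {
  return inputSource.palmSpace ?? inputSource.gripSpace ?? inputSource.targetRaySpace;
}

// In the frame loop, the chosen space resolves like any other:
//   const pose = frame.getPose(pickGrabSpace(source), referenceSpace);
```

The point of the fallback chain is that existing content keeps working on runtimes that never expose the new space.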
What is group space?
A potential issue is that this space is designed for the default controller model, i.e. I'm unsure whether it's the same for every Quest controller (but I can check).
I meant 'grip' space (I edited after). As in, the controller model can still use grip space, but for items of potentially variable size this would be better. The way I have been working is to make all models with handles match the controller size so they are held nicely, but if there is a space that sits flush with where the skin would be, there can be more flexibility.
Removing the label, because this was discussed at the last meeting. If this is in error or more discussion is needed, please reuse '/agenda'.
Potentially related: https://twitter.com/Mitsuownes/status/1630355532652376065

I wonder if we should plan for the day when controllers approach full hand-tracking in their capabilities. I could see exposing a grip space via hand-tracking even now without needing a spec change. It would be confusing for existing sites that probably expect a finer-grained level of tracking, but there might be a point at which we decide that a controller is better suited to expose this data via an XRHand.
The Quest Touch controllers already sense finger touches (and finger positions, in the case of the Pro controllers).
My point is more that "where the controller is held" (or rather, where we expect the controller to be held, unless we actually have sensors on the controller that can report the exact palm pose) can also be surfaced as an XRHand, and I can see that becoming more accurate in the future (and more appropriate, e.g. if controllers start exposing the state of all the fingers of a hand on the controller). To me the noteworthy thing from the video is that the virtual palm seems to get slightly adjusted based on finger touches, even though the real hand's palm doesn't have this adjustment.

I'm assuming hand rendering in WebXR has to be driven entirely by the site, based on the palm space and the input source state? I'm approaching this from "the use case is rendering a hand" (is that the actual use case here?) and trying to figure out whether we have other options. Naively, I think it would be easier to render a hand given an XRHand instance, since that could be made to work both for apps using hand-tracking and for controllers (assuming we expose XRHand for controllers too), and it seems future-proof. It does place more burden on browser implementations, though.
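The "render a hand given an XRHand" idea above could be sketched roughly as follows, following the shapes in the WebXR Hand Input module (`hand` is map-like from joint name to `XRJointSpace`, and `XRJointPose` carries a `radius`). This is a hedged sketch of how a site might consume the data, not spec text:

```javascript
// Build a simple draw list (one sphere per tracked joint) from an XRHand.
// frame.getJointPose(jointSpace, refSpace) returns null-ish for joints
// that are not tracked in the current frame.
function handDrawList(frame, hand, refSpace) {
  const spheres = [];
  for (const jointSpace of hand.values()) {
    const pose = frame.getJointPose(jointSpace, refSpace);
    if (!pose) continue; // joint untracked this frame
    spheres.push({ position: pose.transform.position, radius: pose.radius });
  }
  return spheres;
}
```

If controllers also exposed an XRHand, this exact code path would render a plausible hand in both the hand-tracking and controller cases, which is the future-proofing argument made above.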
OpenXR added support for a palm pose. This allows an experience to figure out where the user's hand/palm would be when they use the standard OS controller.
https://youtu.be/g-zupOaJ6dI?t=392 has some great visualizations that show the issues that arise when this is missing.
It seems that the explainer basically describes this behavior for the grip pose, but the spec has the correct language for grip.
Should we add this as a new optional space?
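Without such a space, a site can only approximate a palm pose by offsetting the grip pose with a per-controller constant, which is exactly the fragile guesswork a UA-provided palm space would remove. A minimal sketch of that workaround, with an assumed (made-up) offset:

```javascript
// Approximate a palm position by offsetting the grip pose.
// PALM_OFFSET is an illustrative guess, not a real calibration value;
// in practice it differs per controller model, which is the problem
// a spec-defined palm space would solve.
const PALM_OFFSET = { x: 0, y: -0.02, z: 0.05 }; // metres, grip-local (assumed)

// Rotate a vector by a unit quaternion {x, y, z, w} (standard formula).
function rotateByQuat(v, q) {
  // t = 2 * (q.xyz × v)
  const tx = 2 * (q.y * v.z - q.z * v.y);
  const ty = 2 * (q.z * v.x - q.x * v.z);
  const tz = 2 * (q.x * v.y - q.y * v.x);
  // v' = v + q.w * t + q.xyz × t
  return {
    x: v.x + q.w * tx + (q.y * tz - q.z * ty),
    y: v.y + q.w * ty + (q.z * tx - q.x * tz),
    z: v.z + q.w * tz + (q.x * ty - q.y * tx),
  };
}

// gripTransform mirrors XRRigidTransform: { position, orientation }.
function approximatePalmPosition(gripTransform) {
  const o = rotateByQuat(PALM_OFFSET, gripTransform.orientation);
  return {
    x: gripTransform.position.x + o.x,
    y: gripTransform.position.y + o.y,
    z: gripTransform.position.z + o.z,
  };
}
```

A new optional space would let the runtime, which actually knows the controller geometry, supply this pose instead of every site hard-coding offsets like the above.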
/agenda Add support for 'palmspace'