Need FOV angles for interop between WebXR and OpenVR
ognkrmms opened this issue · 1 comment
Hi,
I am developing an OpenVR application that video-streams to a WebXR frontend. I know projectionMatrix must not be decomposed into raw angles, as mentioned in #461 and #575. But to the best of my knowledge, OpenVR works directly with FOV angles (SetDisplayProjectionRaw). How can I work around this limitation?
To clarify: Are you taking values from OpenVR and pushing them to WebXR? There shouldn't be any conflict in that case: you can simply compute the projection matrix from the FOV angles. Chrome did this for a while, before we fully switched to an OpenXR backend. (The spec text is there to warn developers using WebXR not to decompose the matrix. WebXR implementations are free to determine the matrix however they see fit.)
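For the OpenVR-to-WebXR direction, the conversion is a standard off-axis perspective matrix. A minimal sketch, assuming the values are tangents of the half-angles as OpenVR's raw-projection calls report them (left/bottom typically negative); the function name is mine, not part of either API:

```javascript
// Build a row-major, right-handed, OpenGL-style off-axis projection matrix
// from tangents of the half-angles. l, r, b, t are tan(angle) values
// (l and b are normally negative); near/far are clip plane distances.
function projectionFromTangents(l, r, b, t, near, far) {
  return [
    [2 / (r - l), 0,           (r + l) / (r - l),            0],
    [0,           2 / (t - b), (t + b) / (t - b),            0],
    [0,           0,           -(far + near) / (far - near), -2 * far * near / (far - near)],
    [0,           0,           -1,                           0],
  ];
}

// Example: a symmetric 90° horizontal and vertical FOV (tan 45° = 1).
const m = projectionFromTangents(-1, 1, -1, 1, 0.1, 1000);
```

Note that WebXR's `projectionMatrix` is a column-major `Float32Array`, so the rows above would need to be transposed and flattened before handing them to the page.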
If it's the other way around, though, and you're feeding values from WebXR into OpenVR (which I'd like to know more about! I'm not sure how that would work), then you're probably stuck decomposing the matrix anyway, despite the warnings in the spec. The reasoning is that the primary argument against decomposing into angles is that the matrix may contain shearing, but if it does, that headset is fundamentally incompatible with OpenVR anyway. (Or at the very least it cannot accurately describe its projection to the API. How disruptive that is depends on the hardware/optics.)
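If you do end up decomposing, the inverse of the matrix construction above recovers the four tangents, and you can detect the shear case up front. A sketch under the same assumptions (row-major matrix of the usual off-axis form; helper name is mine):

```javascript
// Recover half-angle tangents from an off-axis projection matrix
// (row-major). Throws if the matrix contains shear terms, since a
// sheared projection cannot be represented as four FOV angles.
function tangentsFromProjection(m) {
  if (m[0][1] !== 0 || m[1][0] !== 0) {
    throw new Error('projection matrix contains shear; not representable as FOV angles');
  }
  return {
    left:   (m[0][2] - 1) / m[0][0],
    right:  (m[0][2] + 1) / m[0][0],
    bottom: (m[1][2] - 1) / m[1][1],
    top:    (m[1][2] + 1) / m[1][1],
  };
}
```

The returned values are tangents; take `Math.atan` of each if you need actual angles rather than the raw tangent form.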
So decomposing the matrix is the most correct thing you can do for interop between the two APIs; it should work well on the vast majority of devices, but it is not guaranteed to work universally.