Why do some games require parallel projection but others don't?

What are the main things a game does that make it require parallel projection, as with Elite Dangerous, Project Cars 2, etc.? What makes them different from games that don't require it? Is it the way it is implemented? Is it because those games don't support OpenVR?

I’m trying to find out the reasons why so I can hopefully solve getting rid of the need for parallel projection by modding certain games.

Yes, it’s “the way it is implemented”. Basically, it’s because the games which require parallel projection assume both eye panels are straight ahead and ignore any angle.

That is, they don't fully use the API's return info for the per-eye rendering transform. For most headsets, that's a shortcut which provides a valid result, but not for a headset with canted (angled) panels. (It's not dissimilar to hard-coding the object culling angle for narrow-FOV headsets, back when there were no wide-FOV headsets.) It's a programming shortcut that's no longer valid.

Actually, most (all?) Oculus games don’t need PP. I think it’s associated with how programmers implemented the OpenVR API setup calls.

The fix is to update the per-eye “camera” transform code. I’m not sure you can mod a game that you don’t have source code for.
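For illustration, a minimal sketch (OpenVR, with made-up helper names) of the difference between the shortcut and doing it properly:

```cpp
#include <openvr.h>

// The "parallel projection" shortcut: keep only the translation from the
// eye-to-head transform (roughly a half-IPD shift along X) and assume the eye
// looks straight ahead. Fine for flat-panel HMDs, wrong for canted panels.
float ShortcutEyeOffsetX(vr::IVRSystem* sys, vr::EVREye eye)
{
    vr::HmdMatrix34_t eyeToHead = sys->GetEyeToHeadTransform(eye);
    return eyeToHead.m[0][3];   // translation only; the 3x3 rotation is discarded
}

// The fix: keep the full 3x4 eye-to-head transform (rotation included) and fold
// it into the per-eye view matrix: view(eye) = inverse(hmdPoseInWorld * eyeToHead)
vr::HmdMatrix34_t FullEyePose(vr::IVRSystem* sys, vr::EVREye eye)
{
    return sys->GetEyeToHeadTransform(eye);
}
```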

Here’s some more discussion, in the context of an original Quake VR mod…

https://community.openmr.ai/t/how-to-enable-native-pimax-wide-fov-support-in-games/28420?u=neal_white_iii

6 Likes

Some other details of note:

It's not just games making this assumption; so did Nvidia in the 10-series GPUs. They only permitted one variable for simultaneous multi-view rendering, which turns out to be your IPD when used for VR.

It's actually easier to not require parallel projection, because you get complete view and projection matrices from the APIs. Therefore it's mostly specific engines that do it the clumsy way, sadly including Elite Dangerous.

A very few programs have a reason to rebuild the projection matrices themselves, such as how (IIRC) Space Engine turns the Z buffer inside out; it's absolutely possible to do from the matrices, but slightly trickier.

It would be possible to patch games having this issue, but it’s a challenge. Stereoscopic conversion wrappers, like Vireio Perception, do a similar thing.

6 Likes

Ah, now it makes sense why some games disregarded part of the info returned by the API. That's what you needed to do to take advantage of the Nvidia 10xx multi-viewport rendering feature.

Thanks for the info!

1 Like

I guess you might be confusing the projection matrix (which defines the perspective projection frustum) with the view matrix (which defines the (virtual) camera pose).

The apps which do have problems with non-parallel cameras (i.e. non-parallel projection in Pimax speak) are actually misusing (or ignoring) the latter.
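To make the distinction concrete, here is a rough sketch of how the two are usually obtained and combined (assuming GLM for the math; the helper names are mine, not from any engine):

```cpp
#include <glm/glm.hpp>
#include <openvr.h>

// Convert OpenVR's row-major 3x4 transform into a column-major GLM 4x4.
glm::mat4 ToMat4(const vr::HmdMatrix34_t& m)
{
    return glm::mat4(
        m.m[0][0], m.m[1][0], m.m[2][0], 0.0f,
        m.m[0][1], m.m[1][1], m.m[2][1], 0.0f,
        m.m[0][2], m.m[1][2], m.m[2][2], 0.0f,
        m.m[0][3], m.m[1][3], m.m[2][3], 1.0f);
}

// View matrix: the (virtual) camera pose, inverted. The panel canting lives in
// the eye-to-head transform, which is exactly what the problematic apps ignore.
glm::mat4 EyeViewMatrix(vr::IVRSystem* sys, const glm::mat4& hmdPoseInWorld, vr::EVREye eye)
{
    glm::mat4 eyeToHead = ToMat4(sys->GetEyeToHeadTransform(eye));
    return glm::inverse(hmdPoseInWorld * eyeToHead);
}

// Projection matrix: the frustum shape only; no camera pose in here.
// vr::HmdMatrix44_t proj = sys->GetProjectionMatrix(eye, zNear, zFar);
```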

I doubt that the games really used Nvidia single pass stereo (SPS) rendering, which was introduced with the Pascal series (GTX 10x0) cards. There are, however, games which use optimized single-pass rendering for other optimization reasons (Beat Saber, for its bloom effects), and it is usually non-trivial to change that to a generic non-parallel-cameras implementation, because the single pass stereo optimization would not work in that configuration.

So there is always a trade-off involved in going from single pass stereo (with its optimization benefits, but requiring parallel projection) to a proper non-parallel projection (cameras), giving up the optimization. Since no one is using hardware acceleration for that (AFAIK), this is usually a tough call (plus it requires resources to reimplement a significant part of the engine).
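As a hypothetical illustration of that trade-off on the engine side (these structs are made up, not taken from any real engine), the per-eye data an engine uploads to its shaders might look like this in the two modes:

```cpp
#include <glm/glm.hpp>

// Single-pass-stereo friendly: one shared view-projection, and the only per-eye
// difference is a horizontal offset of half the IPD. Only valid when both eye
// frusta are parallel, i.e. "parallel projection".
struct ParallelStereoConstants {
    glm::mat4 sharedViewProj;
    float     eyeOffsetX[2];   // { -ipd/2, +ipd/2 }
};

// General case: each eye carries its own full view and projection matrix, so
// canted panels and asymmetric frusta (Pimax, Index) just work, but the
// "one variable per eye" optimization no longer applies.
struct PerEyeConstants {
    glm::mat4 view[2];
    glm::mat4 proj[2];
};
```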

8 Likes

GetProjectionMatrix. The 3x4 matrix produced by GetEyeToHeadTransform doesn’t contain the Z manipulation for the perspective division.

GetProjectionRaw also has this problem, as it produces the variables required for gluPerspective, which assumes the Z axis is parallel with the display normal (and that the center of the display is directly ahead of the eye). These assumptions held unusually well for headsets like the Vive (dedicated display panel centered straight ahead of each eye), but not Pimax or Index.

I may be a little mistaken on how these pieces fit together. It could be that the ComposeProjection example in the GetProjectionRaw documentation works correctly when combined with GetEyeToHeadTransform, because it has enough variables to convey how off-center the render frame is (while GetEyeToHeadTransform can contain the rotation). Both gluPerspective and Microsoft's sample ProjectionMatrix assume centered displays, however; D3DXMatrixPerspectiveOffCenterRH doesn't.
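For reference, an off-center projection along those lines can be sketched directly from GetProjectionRaw's half-angle tangents, roughly in the spirit of the ComposeProjection example (the exact signs and clip-space conventions depend on the engine, so treat this as illustrative):

```cpp
#include <openvr.h>

// Sketch: asymmetric ("off-center") frustum from GetProjectionRaw's tangents.
// Nothing here assumes the display is centered straight ahead of the eye;
// any canting is carried separately by GetEyeToHeadTransform.
vr::HmdMatrix44_t EyeProjection(vr::IVRSystem* sys, vr::EVREye eye, float zNear, float zFar)
{
    float left, right, top, bottom;   // tangents of the frustum half-angles
    sys->GetProjectionRaw(eye, &left, &right, &top, &bottom);

    float idx = 1.0f / (right - left);
    float idy = 1.0f / (bottom - top);
    float idz = 1.0f / (zFar - zNear);

    vr::HmdMatrix44_t p = {};
    p.m[0][0] = 2.0f * idx;   p.m[0][2] = (right + left) * idx;
    p.m[1][1] = 2.0f * idy;   p.m[1][2] = (bottom + top) * idy;
    p.m[2][2] = -zFar * idz;  p.m[2][3] = -zFar * zNear * idz;
    p.m[3][2] = -1.0f;
    return p;
}
```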

6 Likes

You’re right. Sorry, I’m used to thinking of them in one lump, as they’re typically just multiplied together before being fed into the hardware.

4 Likes

After the main culprit (which has already been mentioned) has been dealt with, games can still get other things wrong if their shaders, or e.g. object trackers, make assumptions about the orientation of the view plane. This shows up in some shadows and reflections, and in various screen-space effects more often than not. It can even come from tonemapping being handled separately for each eye view (each covering a different part of the scene), as can be seen in Elite: Dangerous: when you turn toward a bright star, the near eye's view begins to dim a considerable number of degrees before the far one does (EDIT: …once the panning star reaches it and begins to claim screen real estate).

I suppose shaders, specifically, are something a very crafty individual could possibly extract, reverse engineer, modify, and substitute, using tools which I understand exist for that purpose… If nothing else, we do have user "Old Duck" over at the Frontier forums going absolutely berserk, at the very least disabling choice Elite Dangerous shaders he finds unsightly. :7

6 Likes

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.