Elite Dangerous - Non-Parallel HMD Display Support - PLEASE VOTE

Hi Pimaxers,

Please add to and vote for this issue with Frontier Developments. Let’s get them to fix it for a much better VR experience:



FYI, we already have a thread for this. Thanks for the bump though.



Already voted with my three commanders.


They have acknowledged the issue, so that is good. There were 23 acknowledged issues before the last patch. Now there are 18. (So they fixed 5?)

So hopefully they are at least considering looking into this. It is in the top few when sorted by votes, but not the top, so we still need more votes.

This issue makes me think it’s a good time for us to start getting Pimax-specific versions of software from developers. If you could have more than two viewports to render from, you could get rid of all the distortion the lenses presently suffer from, and it would make this headset so much better.

You put “Multi View Rendering” into the title and then wrote about the parallel projection. So are you interested in MVR or in fixing the parallel projection? Or do you suggest that the former should be used to fix the latter? Right now it is confusing.


Follow the link to the Frontier issue ticket and you will be un-confused.



Considering I was among the first who confirmed the ticket, and even took the liberty of explaining to FDev what they were doing wrong, I guess I am quite unconfused about the ticket.

My point was that MVR is something currently only supported on Nvidia RTX cards, so if you are suggesting that they should fix it by using MVR (which you have neither confirmed nor dismissed), it will not help anyone who does not have an RTX card.


Since RTX cards are becoming more prevalent, I will let you decide how much further you’d like to un-confuse yourself.

voted


Also voted for the issue.


The ticket isn’t about MVR. MVR is an approach that (I think?) could be used to fix this, but it is not the only way, and it would not work on non-20-series cards. Using GetEyeToHeadTransform properly instead (again, I think; I’m not an expert) would work for many more people on more cards and would also come with a performance boost for all. This is what most/all games that don’t require PP do.

This is where the confusion comes from. If you want MVR you may need a new ticket. Not the one you linked.


Changed the topic to make everyone happy. :stuck_out_tongue:

Has Pimax ever released a guide for developers that explains how to fix the problem? With more general info, and also a specific guide for Unity and UE4?

Often a new game has this problem and the dev is readily available on Discord, but they can’t even address the problem if they don’t know anything about it.


@risa2000 already posted that multiple times, but the fix is pretty simple (unless the game really does something unusual). What needs to happen is this API used as intended:

The eye matrix needs to be inverted (and the API comment explains that). On in-plane HMDs it does not matter, because the rotation part of their eye matrix is identity. But on Pimax the camera should be rotated slightly. When done correctly, the same code will work with both in-plane and canted-display HMDs.

The above will fix the Steam version of ED.


Thank you, post saved. However, I believe a more informative document published on the Pimax website would carry greater authority when forwarded to a dev than user-generated content on a forum.

In addition, if the solution were proposed directly in the issue opened with Frontier, maybe the devs would take it more into account.


Maybe Pimax should send David Braben a headset.


You are right that I posted it already several times, but I did not say that the fix is pretty simple. In particular:

This would not work even with parallel views (Vive, Oculus, etc.), because the inverted matrix also inverts the eye positions, so the game would render the right eye view into the left eye and vice versa. There must be something else that is also wrong. So I do not believe that the fix for ED is that easy.

The solution is probably more complex, but the root cause, i.e. the incorrect use (or no use at all) of the eye view transformation matrices, is clear, and I mentioned it in my confirmation of the bug (it is written on the first page of the bug confirmations).

My speculation about why it does not work in ED is that ED uses some rendering “hacks” in VR to save performance. For example (just a speculation), some effects are rendered in a single pass for both eyes, instead of being rendered correctly for each eye geometry individually. This, however, stops working when the views are no longer only shifted but also rotated, and the fix might require rewriting parts of that code, or might not even be possible without significantly impacting performance.

Why do I think that? The Beat Saber example. This game runs in Unity and has the geometry correctly rendered in both modes, PP on and PP off (which also confirms that, in general, Unity should support Pimax native mode fine).

The problem is only with one particular effect (bloom), which is however applied to almost all objects in the game. This effect is rendered in a single pass (I was able to debug the released version of the game with the help of @brian91292, to the point where I can tell exactly which function in the game code is wrong, and why).

Now the BS devs are facing a difficult situation: they need to either rewrite the offending shader (which is the culprit), or modify the game so it no longer uses this “hack” but does it properly, at the risk of severely impacting performance. I am not even sure the former is possible at all, and the latter might not be acceptable.

Needless to say, ED does not use Unity (so it does not even have the geometry correct) and is much more complex than BS.

So yes, it is “easy” to see what is wrong with the game, but the fix might be much more complex.


Correct, but there’s a way to just negate the translation part, without inverting the rotational part :slight_smile:

I can totally imagine a game not calling that API at all. You just create two offset cameras and transform them with the absolute device tracking that OpenVR provides. But there might be other ways of doing things.

And I agree, one of the possible complications is post-processing effects. Depending on when/how they are applied, things might not be trivial to change.