Closer look at Pimax parallel projection

I’ve started poking at some code and managed to extract the IPD, cant angle (10 degrees each eye) and hidden area mesh (turns out Pimax only reports the normal form, covering hidden regions, not the circumference or inverted forms). I’ll try to set it up in a Blender hierarchy to model it all.

Also found a typical example of the broken code that’s discarding the view rotation: rust-webvr extracts only the IPD in fetch_eye_parameters. This is the type of code that breaks with canted rendering; just keep the transform intact and it all works.
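To make the failure mode concrete, here is a minimal sketch (plain Python with made-up numbers, not rust-webvr's actual API) of what goes wrong when code keeps only the translation part (the IPD) of the eye-to-head transform and throws away the rotation:

```python
import math

def rot_y(deg):
    # 3x3 rotation about the Y (up) axis
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]

def mat_vec(m, v):
    return [sum(m[r][k] * v[k] for k in range(3)) for r in range(3)]

# Hypothetical eye-to-head transform for the left eye of a canted HMD
# (10 degrees per eye), split into rotation + half-IPD translation:
ipd = 0.064
left_eye = {"rot": rot_y(10.0), "trans": [-ipd / 2, 0.0, 0.0]}

forward = [0.0, 0.0, -1.0]

# Broken approach (what extracting only the IPD effectively does):
# the rotation is discarded, so the eye still looks straight ahead.
broken_view_dir = forward
print(broken_view_dir)  # [0.0, 0.0, -1.0]

# Correct approach: apply the full transform, so the left eye looks
# 10 degrees outward, matching the canted display.
correct_view_dir = mat_vec(left_eye["rot"], forward)
print([round(x, 3) for x in correct_view_dir])
```

With the full transform the rendered frustum rotates with the display and everything lines up; with the broken version the image is rendered for a straight-ahead eye and appears skewed in a canted headset.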

3 Likes

I am looking forward to it since, as a matter of fact, I did my visualizations in Blender too. But I had never used Blender before, and it has a bit of a steep learning curve, so my visualizations are quite basic at the moment :slight_smile:.

One may well wonder why, when OpenVR provides the complete transformation, the devs go to all the trouble of mocking up a bastardized version, only to break it.

3 Likes

I am trying to modify Valve's HelloOpenVR sample to draw the mesh. If I succeed, I could share the result, if that would be useful. The mesh is then visible in the SteamVR mirror.

EDIT: Just realized that if the goal is only to see the mesh, it is already possible with games that use it. Just get Raceroom (it is free) and enable the SteamVR mirror.

2 Likes

(My recollection of the config data Sjefdeklerk dug out of an older build of piserver is that the mask appeared to be defined as one discrete triangle per corner (overlapping where needed), in normalised coordinates, with a single-digit number defining how much each should be subdivided. I don't know how the subdivision and splining (to make those hypotenuses concave) works there (EDIT2: …maybe simply a Bézier toward the right-angle corner, or something…), but I reckon the output spat out by SteamVR should be few enough numbers (polygon edges are certainly very visible in the desktop mirror display) to plot by hand, and then project onto the PP plane (whose height need be no taller than where its top and bottom edges just brush the visible-area perimeter at its tallest point (EDIT: Wonder if that's what Pimax's software engineer on the task has set it by…)). But I very much understand the desire for automation, which would make the procedure easy to repeat. :slight_smile: )

1 Like

In this modern age, we no longer need to dig in pi_server. Instead, a call to a friendly function, IVRSystem::GetHiddenAreaMesh, gives us all that is needed. After a quick look, it seems that the function returns 40 triangles, 10 in each corner, to form the complete mesh for the mask.
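For anyone who wants to sanity-check a dumped mesh, here is a small sketch of how one might estimate the masked area from such a triangle list. OpenVR returns the triangles in normalized [0,1] UV coordinates (as a `HiddenAreaMesh_t` with `pVertexData` and `unTriangleCount`); the two triangles below are a made-up single-corner stand-in, not real Pimax data:

```python
# Sketch: estimating how much of the render target a hidden-area mesh masks.
# The mask triangles live in normalized UV space, so the full render target
# has area 1.0 and the summed triangle area is directly the masked fraction.

def tri_area(a, b, c):
    # Shoelace formula for one triangle
    return abs((b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1])) / 2.0

# Hypothetical fan of 2 triangles masking part of the top-left corner:
mask = [
    ((0.0, 1.0), (0.0, 0.8), (0.1, 0.95)),
    ((0.0, 1.0), (0.1, 0.95), (0.2, 1.0)),
]

masked = sum(tri_area(*t) for t in mask)
print(f"masked fraction: {masked:.4f}")
```

On a real dump, the masked fraction is a quick way to compare how much fill-rate the small, normal, and large FOV masks actually save.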

3 Likes

Sigh These new-fangled ideas and concepts… :slight_smile:

1 Like

So I have a question…

I'm currently pretty close with a developer working on VR support for their game. I helped them demo their game last year at a convention with a Vive, and I will be helping them again this year with TWO Pimax 5K+s at a convention in August. The game is Titanic Honor and Glory. If you've tried their demo, you know it's extremely performance intensive, even more so on Pimax. This is exacerbated by the fact that it requires PP to be on to run. I plan on discussing Pimax with them when I get to the convention and I'd like to talk about PP, but I'm not really sure what PP is doing and why it's needed for some games.

What should I say to them, in layman's terms, to explain why this is happening and how to fix it, so they won't require PP when the game is eventually released?

(Sorry if this is kind of off topic)

2 Likes

Incidentally, I backed that game and have a deck under my name. Hope all is well with the project. Good to know someone in the forum (who is familiar with Pimax) is helping them. Cheers for that mate!

1 Like

What you should tell them is easy to put into words: They need to correctly handle the eye to head transformation provided by IVRSystem::GetEyeToHeadTransform in their game. This transformation describes the eye geometry, i.e. the eye position (=IPD) and the view rotation (=display+lens rotation).
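To illustrate what "handle the transformation" means in practice, here is a hedged sketch in plain Python (the matrix helpers, the 10-degree cant, and the IPD value are all illustrative; OpenVR actually returns a `HmdMatrix34_t`, and a real engine would use its own math library). The key point is that the per-eye view matrix is the inverse of the head pose composed with the *whole* eye-to-head transform, never just its translation:

```python
import math

def mat_mul(a, b):
    # 4x4 matrix product
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def rigid_inverse(m):
    # Inverse of a rigid transform: transpose the rotation,
    # rotate-and-negate the translation.
    r = [[m[j][i] for j in range(3)] for i in range(3)]
    t = [-sum(r[i][j] * m[j][3] for j in range(3)) for i in range(3)]
    return [r[0] + [t[0]], r[1] + [t[1]], r[2] + [t[2]], [0, 0, 0, 1]]

def eye_to_head(cant_deg, half_ipd):
    # Hypothetical canted-eye transform: yaw rotation + half-IPD offset
    c, s = math.cos(math.radians(cant_deg)), math.sin(math.radians(cant_deg))
    return [[c, 0, s, half_ipd], [0, 1, 0, 0], [-s, 0, c, 0], [0, 0, 0, 1]]

# Hypothetical head pose: standing, 1.7 m up, looking down -Z
head = [[1, 0, 0, 0], [0, 1, 0, 1.7], [0, 0, 1, 0], [0, 0, 0, 1]]

# Correct per-eye view matrix: inverse(head * eye_to_head), as one rigid chain
view_left = rigid_inverse(mat_mul(head, eye_to_head(10.0, -0.032)))
```

If the game's camera code composes the chain like this, canted displays work for free; dropping the rotation part is exactly the bug discussed above.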

Now the difficult part might be what exactly "correctly" means, as I have already witnessed that it is not as obvious as I would hope; but I guess that is something which (if needed) can be addressed later.

3 Likes

Here's my crude attempt at comparing render-buffer fields of view, using the field of view data. I can't help feeling the parallel projections are too tall, the mask regions seem suspicious (I can't reproduce the same results, so they likely mask too little and in the wrong shape), and I should probably double-check whether the projection matrices agree with the reported fields of view.

Pimax 5K+ fields of view translated to Blender
You’ll need Blender to open the file.

As for supporting the canted displays, it’s a minor detail in the setup of the VR cameras and should be handled by the engine or its VR addon (I believe TH&G uses UE4). Another thing to keep in mind is to avoid screen space; things like SSAO and screen space reflections are prone to producing binocular conflict, and things like HUDs just end up out of view and unreal. It’s really immersion breaking to have to look straight through a “real” object to read some caption.

2 Likes

When I did my render, I had to scale down the projection planes along the Z coordinate (in OpenGL, not Blender, notation) to bring them close enough to separate the viewing frustums. I would suggest you do that too, or use a much larger IPD (for visualization it does not matter anyway).

The other thing I did was scale the PP plane and the canted plane so that they share the inner vertical edge (this is possible because the projection geometry for PP and non-PP has the same vertical planes of the projection “cone”).

These two things made the final image much clearer. Technically, this way you should get the same shapes I did. I checked your meshes and they seem OK.

I dumped the meshes myself too (but have not yet rendered them into the views) and found that the small and normal ones are the same for both PP and non-PP modes. The large one is different, and it also differs between PP and non-PP.

This alone suggests that Pimax did not really bother to think much about them, because in principle the meshes cannot be the same for PP and non-PP modes if they are to mask the same viewing area.

Just for comparison, here are the meshes. From top to bottom: small (=PP small), normal (=PP normal), large, and PP large. The last one is missing the outer tips.

3 Likes

For what it's worth, I took the liberty of trying to extrude one of those masks (normal FOV, canted, scaled from the object origin) out to where it intersects a plane orthogonal to the global coordinates, hinged off it, and framed the result in a bounding box. It came out like this:
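The geometric projection above can be sketched in a few lines of math: take a point on the canted image plane, rotate its direction into head space by the cant angle, then intersect that ray with the parallel plane z = -1. This is a hedged illustration only (Python, with the 10-degree cant as the sole input; the sign convention assumes the left eye cants toward -X):

```python
import math

def to_parallel_plane(x, y, cant_deg):
    # (x, y) is a point on the canted image plane (z = -1 in eye space).
    c, s = math.cos(math.radians(cant_deg)), math.sin(math.radians(cant_deg))
    # Rotate the eye-space direction (x, y, -1) about Y into head space:
    dx, dy, dz = c * x - s, y, -s * x - c
    # Scale the ray so it hits the head-space plane z = -1:
    return (dx / -dz, dy / -dz)

# The canted view's centre lands off-centre on the parallel plane:
print(to_parallel_plane(0.0, 0.0, 10.0))
```

Note that the vertical coordinate gets divided by a factor that varies with x; that keystoning is exactly why the projected mask's top and bottom edges come out slanted, as in the render above.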

Clipboard08

3 Likes

Exactly; that's the sort of perspective-projected outline I would have expected. In particular, the angled top and bottom form a large chunk of the parallel projections that needn't be rendered and is currently hurting performance (assuming our outline is correct, which I'm rather doubting at the moment). Another flaw is how the inner-edge hidden area grows with the field of view. This mesh might help, but it is clearly not correctly matched.

2 Likes

Here are basically the same renders as in the original post, except that instead of the solid projection planes they render the hidden-area mask meshes.

The colors are the same as in the OP, i.e.

  • Blue: Hidden area mask (HAM) in the native view.
  • Green: The native HAM projected onto the parallel plane (what @jojon did here).
  • Red: Pimax parallel projection HAM.

The mixed colors (yellow, cyan, magenta) are the areas where the respective meshes overlap. It is evident that the Pimax HAM for parallel projection is not right on the inner side. The outer side is not as bad.

Small FOV

Normal FOV

Large FOV

8 Likes

First of all, thank you @risa2000. At least to some extent, I now understand why this “parallel projections” option is necessary for the Pimax. That there are paid games with no support for HMDs with canted displays is, in my eyes, an absolute impudence!

Nevertheless, there are 2 things I do not understand:

  1. Why can’t the red layer be rendered at the same resolution as the blue one? (Because the higher resolution is what requires more GPU performance.)

  2. As far as I know, the Valve Index also has angled displays, but all games run without an option for “parallel projections”. How is that possible?

I am not sure I understand your question. The red layer is what Pimax defines as the “canvas” for parallel-projection rendering. The blue one is the canvas for native rendering.
How blue and red relate is actually shown by the green one, which is the blue mesh projected into the “parallel plane” of the red, so you can compare them directly there (area-wise). If by “resolution” you actually mean the size of the different areas, then the sizes of blue and red are defined by the headset (Pimax).

Valve Index uses parallel projection implicitly (without the user's knowledge). I believe there is a way to turn native mode on, but one needs to change some config files manually (maybe an Index owner could correct me).

3 Likes

Yes. I understand it this way: if PP is off, the Pimax software lets applications render directly to the blue layer. When PP is turned on, it lets applications render to the red layer, and the Pimax software then transforms that into the blue layer. But it should be possible to render the red layer (even though it is “bigger”) at simply the same resolution that would normally be used for the blue one.

OK. But that would actually mean a loss of performance. Of course, it is clear that the FoV of the Index is much smaller than that of the Pimax.

OK, I misunderstood; your understanding is correct. Technically, the blue and the red cover more or less the same angular area (with the red losing a bit in the outside corners, but rendering quite a lot of unused image in the inside corners).

If the resolution for the red and the blue were the same, the user would experience a degradation of resolution across the horizontal FOV when approaching the inner edge of each view, because there only approximately 2/3 of the red pixels are visible. So Pimax must oversample the red enough to compensate for this pixel loss (and also for the parallel-projection transformation).
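A back-of-the-envelope check of that "about 2/3" figure: compare the angular pixel density at the inner FOV edge between the parallel (red) and canted (blue) image planes. The half-angles below are illustrative guesses (inner 35°, outer 45° relative to the canted axis, 10° cant), not real Pimax numbers:

```python
import math

def density(theta_deg, span):
    # Pixels per radian at angle theta on a plane whose `span` tangent units
    # are mapped to one image width: d(tan)/d(theta) = sec^2, scaled by 1/span.
    t = math.radians(theta_deg)
    return (1.0 / math.cos(t) ** 2) / span

cant, inner, outer = 10.0, 35.0, 45.0
tan = lambda d: math.tan(math.radians(d))

# Tangent-space width of each image plane (same pixel count across each):
span_canted = tan(inner) + tan(outer)
span_parallel = tan(inner - cant) + tan(outer + cant)

# Same scene direction at the inner edge: 35 deg in eye space = 25 deg in head space
ratio = density(inner - cant, span_parallel) / density(inner, span_canted)
print(f"parallel/canted pixel density at inner edge: {ratio:.2f}")
```

With these guessed angles the ratio comes out near 2/3, consistent with the estimate above; the real headset angles would shift it somewhat, but the mechanism (the parallel plane spreading its pixels over a wider tangent span) is the same.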

2 Likes

You are quite right!

"steamvr": {
   "renderCameraMode": "[parallel(default)|raw]"
}
2 Likes

@risa2000 Yes, that's right. I forgot that the image rendered on the red plane has the “shape” of the green one (as far as possible).