Clarifying Near IPD vs. Distant IPD confusion

Honestly, I do not know exactly what Pimax does when Parallel Projection is on. What is certain is that the (Pimax) compositor receives images from the app which were rendered for coplanar views, and the only thing the Pimax driver can do is map them onto the canted panels; it cannot change the angles at which the scene was rendered. So technically you might have a point: looking at a scene which has been rendered with parallel projection may give less eyestrain, because it feels closer to what you naturally expect.
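To make the coplanar-vs-canted distinction concrete, here is a minimal Python sketch of the horizontal direction each eye's image is rendered for. The 10° cant value and the helper name are my own illustrative assumptions, not Pimax's actual implementation:

```python
import math

CANT_DEG = 10.0  # assumed outward panel cant per eye (per this thread)

def eye_forward(eye, parallel_projection):
    """Horizontal forward direction (x, z) the app renders with.

    With parallel projection ON, both eyes render looking straight
    ahead (coplanar views); the compositor must then re-map those
    images onto the canted panels. With it OFF, each eye's view is
    yawed outward to match the panel cant.
    """
    yaw = 0.0 if parallel_projection else math.radians(
        CANT_DEG if eye == "right" else -CANT_DEG)
    # x points right, z points straight ahead
    return (math.sin(yaw), math.cos(yaw))

print(eye_forward("left", True))    # coplanar: (0.0, 1.0)
print(eye_forward("right", False))  # canted: approx (0.174, 0.985)
```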

1 Like

What PiTool Render setting are you using, and what are the corresponding SteamVR / in-game render settings? If you are using a PiTool Render setting of 1 then yes, near objects are sharp but distant objects are blurry. Try 1.75 or 2 in PiTool. This has nothing to do with IPD in this case, just the render settings.

At least this has been my experience with IL2 BoS and the Pimax 8K. I can see aircraft out at 8 km distance with the right settings, and at less than 2 km with the wrong render settings.

1 Like

Thanks @blitze, will try that tomorrow.

I guess that if I used a PiTool Render setting > 1, there would be some performance hit.
I’ll check that out.

Thanks

@risa2000 I am talking to doc_ok on Reddit about the subtle eye strain some users report (he has always been super helpful in the community, and he is a well-known VR researcher), and I figured you might be able to answer some of these questions. I was trying to get his expert opinion on this. Can you tell me whether the sweet spot is indeed on the optical axis with the canted screens, and help me better understand whether Pimax is doing these things? I think that is what you were saying in your original post, but I had a bit of trouble understanding some parts.

Doc_ok:

"Hi,

I’ve never held a Pimax 8k and haven’t followed the hardware in detail. If I understand you correctly, it has a physical slider to adjust the position of the lenses, but the screens are fixed, so the slider moves the lenses relative to the screens?

I don’t know how the lenses are supposed to be adjusted. Are you expected to shift them until you are looking through the lens centers when looking directly forward, or is the “sweet spot” set so that your pupils are on the optical axes of the lenses, which would be farther apart as the lenses should be parallel to the screens, and therefore canted by the same angle? Intuitively, the former makes more sense to me, in which case the physical distance between lens centers should match the software IPD; but not having tried an HMD with canted screens/lenses personally, it’s possible that the latter ends up being better, in which case the software IPD might be noticeably lower than the lens center distance. I would have to try this myself.

In principle, either way it’s done shouldn’t cause problems if all internal HMD parameters are updated correctly in response to physical lens movements, but to be honest, I don’t trust the Pimax people to exactly know what they’re doing (based on their track record regarding HMD calibration). If you are seeing convergence issues or eye strain with different lens distance settings, it’s possible that they are not updating the per-eye view frusta correctly.

In headsets like Rift and Vive, where the screens and lenses move as units, IPD changes only affect the frustum offset. But if the lens moves separately from the screen, IPD changes also change the left/right frustum planes, i.e., the frusta become more or less skewed, and as the lenses’ optical axes are also moving relative to the screens, the lens distortion correction formula has to be shifted as well. If the Pimax people forgot any of those updates, or are doing them incorrectly, or if any VR software isn’t flexible enough to react to frustum and/or distortion correction changes, you would definitely see mismatches and experience issues."
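To make doc_ok's point about skewed frusta a bit more concrete, here is a sketch in Python using a simplified pinhole model that ignores lens distortion; the function and all the numbers are illustrative assumptions, not Pimax's actual calibration math:

```python
def frustum_lr(panel_width, lens_center_x, focal_len):
    """Left/right frustum plane tangents for one eye.

    The lens optical axis meets the fixed panel at x = lens_center_x
    (panel x runs from 0 to panel_width). Moving the lens relative to
    the panel changes how much panel lies left vs. right of the axis,
    so the frustum becomes asymmetric (skewed), and the distortion
    correction center must shift along with it.
    """
    tan_left = -lens_center_x / focal_len
    tan_right = (panel_width - lens_center_x) / focal_len
    return tan_left, tan_right

# Lens centered on the panel: symmetric frustum
print(frustum_lr(0.06, 0.030, 0.04))  # tan_left == -tan_right
# Lens shifted 5 mm inward: skewed frustum
print(frustum_lr(0.06, 0.025, 0.04))  # right tangent now larger
```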

P.S. I did offer to lend him my 8K a while ago, but I think he forgot. I think I will ask him again :wink:

@anon74848233 do you guys think you are doing the things doc_ok mentioned: updating the frustum planes, shifting the distortion correction formula, and all the other updates that seem to be required when the screens do not move?

5 Likes

Where are you discussing it? I would like to answer Doc_ok myself directly on Reddit.

1 Like

If we’re speaking with doc_ok, be sure to point him (in addition to Risa’s fine detective work and deductions) to the configuration sjefdeclerk pulled out of Pimax’s software:

https://community.openmr.ai/t/pimax-distortion-editing/11493/3

I don’t think it is accessible as complete blocks of raw ASCII in the binary any longer, but I suspect it is still built on the old DK1 stuff Oculus open-sourced (which probably got Pimax started in the first place), in which case it should be immediately recognisable to somebody who knows their stuff. :7

We need an OFFICIAL statement about this, or at least some explanation of how they calculate the IPD correctly for their optics.

Because which customer would dial in 60.4 mm when they actually have a 68 mm IPD? That is at least one point where I would not feel confident about Pimax.

3 Likes

Yeah - I initially thought upping PiTool render would be a big hit on my system too but in testing I found it to be the opposite.

Push PiTool up and reduce the SteamVR video render setting and you seem to get better performance for a given render resolution target, and a clearer image too; but again, that depends on the resolution target and its relation to the native resolution of the headset.

Just be aware that fpsVR shows the SteamVR supersampling does not stay at the correct setting when you increase the PiTool value.

1 Like

That could be the case. A lot of my initial experimentation was trying to get the Pimax 8K working with a 1070 Max-Q GPU. Things seem easier with the RTX 2080 as it is more capable, and I can now leave SteamVR video on auto and the app at 100% and get a reasonable output.

It will be interesting to see what comes down the line soon with the PiTool update, and also whether Foveated Rendering works and, if it does, how much of an improvement in rendering speed it delivers.

I might play with both Foveated Rendering and Smoothing while keeping PiTool at a render value of 1. Most likely my issues came from pushing PiTool above 1 with those enabled, leading to a degraded image.

I am noticing that even when I set the 5K+ to the infinite-distance IPD of my eyes, I still get slightly disjointed images, especially up close. The views only seem to converge properly at distances of at least 2 meters.

Overall I am quite disappointed that in order to avoid eye strain we have to accept a blurry picture, as well as having only one eye sharp when looking side to side with our eyes.

Is it possible to get different lenses that sit parallel to the eyes instead of parallel to the screens, and fix the distortion with the lens design and the image distortion profile? It is obviously too late now and we have an inferior product, but maybe someone can implement this properly?

I had been planning to do the same (once I finally get my headset). I don’t have any BS/controllers yet so I probably wouldn’t be using it for a few months anyway. :grin:

1 Like

I am not sure whether you already answered or how it went, so just for the record, I am placing my answers to Doc_ok here:

Right. The screens are fixed and the lenses are adjustable in the plane parallel to the panels. Software compensation is done by rendering the warped image at the position which corresponds to the IPD the user dials in on the HMD.
You can check here how the warped images look, and see that there is enough margin on the panel to cover the complete IPD range: https://community.openmr.ai/t/pimax-native-projection-and-pre-lens-warp-transformation/15775

The second option is correct. The pupils are supposed to be on the optical axes of the lenses, so when you look through the lens center, the optical axis is perpendicular to the lens plane and to the panel plane. The angular shift of the panels and lenses from the direct view ahead is 10° on each side outwards. https://community.openmr.ai/t/some-thoughts-on-the-ipd-discrepancy/14754

This matches the actual observation many have made: the former (parallel) arrangement is not only intuitively better, it also does not strain the eyes the way the “divergent” design does.

There are several problems which stem from the current design, and I am not sure they can be easily mitigated even if the distortion profile is good.

  1. When looking dead ahead, the eyes never look through the lens centers, and they do not look through them at the optimal angle (along the axis). This is observed as the image being “not clear”. I do not know whether this can be solved without moving or reconfiguring the optics.
    But since it is possible to get a clear image simply by turning your head (10° to the right or left), it becomes a distraction in normal usage: I can have one or the other eye in focus, but not both.

  2. By moving the lenses closer together it is possible to mitigate the unclear image, but this makes the peripheral distortion even worse, and it also throws the angular fidelity out of balance, because the images are still rendered at a 10° angular shift (short of completely changing the configuration the headset reports to OpenVR), but are now observed at a different angle.

  3. Because of the divergent nature of the views, the correct eye position is sensitive not only to the correct IPD setting but also to the “eye depth” (how deep the eyes sit in the skull), because the system can only be calibrated for one eye depth. Technically, people whose eyes sit deeper need to set a higher IPD (to preserve the convergence of the view at the pupil), and vice versa.
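The eye-depth sensitivity in point 3 follows from simple trigonometry. Assuming the 10° cant described above, a sketch (my own illustration, not anything Pimax ships):

```python
import math

CANT_DEG = 10.0  # outward cant per eye, as discussed above

def ipd_correction_mm(extra_eye_depth_mm):
    """Extra IPD needed to keep both pupils on the canted optical axes
    when the eyes sit extra_eye_depth_mm deeper than the calibrated
    position. Moving back by d shifts each axis outward by
    d * tan(CANT_DEG), i.e. 2 * d * tan(CANT_DEG) in total.
    """
    return 2.0 * extra_eye_depth_mm * math.tan(math.radians(CANT_DEG))

# Eyes sitting 5 mm deeper would call for a ~1.76 mm higher IPD setting
print(round(ipd_correction_mm(5.0), 2))  # 1.76
```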

I believe Pimax chose divergent views because they wanted to target a large FOV and divergent views were the only (relatively easy) solution. This, however, introduced the problems I described above. If there is a solution to the current “one eye sharp, one eye fuzzy” dichotomy I would be happy, but I am not sure Pimax is capable of pulling it off.

2 Likes

This is actually a very good point that I had not realized! The distance of your eyes to the lenses must match the distance expected by the calibration software.

On a single panel like the Vive's, your distance to the headset does not change what you see, but on the Pimax, the farther your eyes are from the lenses, the smaller the angle between the optical center and the infinite view.

This leads me to a hypothesis that reducing the distance between the lenses (lower IPD on Pimax) and bringing your eyes closer to the lenses should get you closer to the optical center and allow you to see more of the panel clearly. If you do this in conjunction with modifying the image calibration such that you get the correct 3D overlap, you may be able to improve the overall situation.
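The geometry behind this hypothesis can be sketched with the same 10° cant assumed earlier in the thread; the eye-relief figures and the helper name are illustrative assumptions:

```python
import math

CANT_DEG = 10.0  # assumed outward cant per eye

def pupil_offset_mm(eye_relief_mm):
    """Lateral distance between a forward-looking pupil and the canted
    optical axis as a function of eye relief (eye-to-lens distance).
    The farther the eye sits from the lens, the farther it is from the
    axis, so moving the eyes closer to the lenses reduces the mismatch.
    """
    return eye_relief_mm * math.tan(math.radians(CANT_DEG))

print(round(pupil_offset_mm(15.0), 2))  # 2.64 (mm, at 15 mm eye relief)
print(round(pupil_offset_mm(10.0), 2))  # 1.76 (mm, closer to the lens)
```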

The Vive has two panels, but they are flat (not canted) in relation to the lenses.

2 Likes

So the IPD lens adjustment does not also move the screens? :astonished:

Surely it’s better if both of these move together. No wonder people with narrow IPD are having issues.

The panels do not move, but the image displayed on them does. It is displayed at a position on the panel based on the IPD the user dials in, so the final effect is the same. You just need enough margin on both sides of the panel to accommodate the range.
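A rough Python sketch of that software compensation; the reference IPD and pixel density below are made-up example values, not Pimax's real calibration data:

```python
def image_shift_px(dialed_ipd_mm, reference_ipd_mm, px_per_mm):
    """Horizontal shift (in pixels) of one eye's image on the fixed
    panel relative to a reference calibration IPD. The panel does not
    move; only the displayed image does, by half the IPD delta per eye,
    which is why the panel needs margin on both sides.
    """
    return (dialed_ipd_mm - reference_ipd_mm) / 2.0 * px_per_mm

# Dialing 68 mm on a headset calibrated at 64 mm, at ~21 px/mm:
print(image_shift_px(68.0, 64.0, 21.0))  # 42.0 (px outward per eye)
```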

2 Likes

I would love to know how far you could see aircraft with a 5k+.

I have just been setting up for IL2 now that I have a card that can run it. I just discovered my distance IPD is 58 mm, but with the 5K+ set to its minimum of 60 mm it looks horrible in the hangar, and the same in flight. I set it to 63.5 mm and it looks much better. I have used thicker foam at the bottom of the 5K+ to get things in focus and keep it off the bridge of my nose.

How did you measure it?
Lately I tried using a smartphone selfie and got a much lower value (~5) than before with the mirror method.