Optical Calibration: In-Depth Info on the Hardware and Software IPD Functions in the Newest PiTool Version

I would like to understand what happens in software when using the hardware IPD wheel vs. the software IPD slider in the newest PiTool versions (1.0.2.86 or 1.0.1.266 and above).

My current understanding and observations are:

The software IPD offset moves the two images further apart (increasing negative values, or perhaps decreasing positive ones??) or closer together (decreasing negative values, or increasing positive ones??). It has a strong effect on stereo separation and the perception of world scale.

Moreover, virtual objects and real-life objects should be in sync with respect to the eyes' accommodation to a focal plane at the same distance. A negative H-offset value increases depth perception.

The hardware IPD wheel moves the lenses physically, and the image seems to jump whenever the IPD overlay appears. I guess the image follows the lenses in software, as the screens themselves are not moved physically.

From this you could conclude that the software IPD and hardware IPD apply the same manipulation to the image; the wheel just moves the lenses in addition to shifting the image on screen.

But that is not what I observe. The wheel does not have any significant influence on stereo separation or world scale when moving, e.g., from 61 to 65 IPD. I remember this being different in earlier PiTool versions (though I am not sure about that).

However, it does have an impact on the distortion profile/perception. The edge distortion is significantly reduced at IPD 61; from 62.5 onwards it is clearly visible and increases towards 65. My real IPD is 64.

So I would like to use IPD 61 in hardware, as it eliminates the distortion. Then I use an H-offset of around -0.7 to get the correct convergence/accommodation described above (in sync with real life).

Unfortunately, values like -0.9 or -0.6 also look fine and in sync, but the values have a significant impact on world perception in VR, and I am stuck here, as no value seems perfectly right. Mostly the world scale seems too big.

When you use the headset for a few minutes, every value (-1.5 or -0.3) starts to seem right as long as it does not hurt the eyes. The brain adapts quite fast to the settings, and I need to wait some hours or a day before continuing my “optical calibration”.

When I use the Vive Pro or Index, everything feels right in the first second after putting it on. With the Pimax XR it always feels a little bit odd in the first seconds to minutes, and I try the next offset value!

And most importantly: when I use IPD 64 on the wheel and H-offset 0, it is totally off and definitely not correct.

Can anyone help me with this? I have been trying for months and have not played any games since becoming a Pimax user!

I have read everything I could find on the internet about this but do not yet have a final solution to my problem.

It would be nice if someone could explain in depth how the PiTool software works and what it really does to the image. Does the software IPD, for example, only shift the rendered images, or does it shift the positions of the cameras used for rendering? The results would be subtly but significantly different.

4 Likes

That's a very good point and observation, and it coincides with my experience as well; I find myself fiddling around again and again. Funnily enough, I never understood the fuss @mixedrealityTV was making about the distortion while I was still on my 8K, until I got the 8KX, where it is clearly noticeable even at Normal FOV, and I am constantly fighting for the best compromise. I do not know whether the lenses of the 8K were in fact superior in that regard or whether this comes down to a change in PiTool (digging out my 8K and making a comparison was on my agenda anyway).
However, from my 8KX experience, I feel this could be improved if the mechanical IPD only controlled the physical distance of the lenses, the IPD correction shifted the image, and an additional convergence slider were introduced to adjust the camera angle.

3 Likes

FWIW my Pimax 5k+ behaves like this:

  1. Setting the hardware IPD:
    a) moves the lenses,
    b) sets the virtual cameras' positions to correspond to the IPD.

  2. Setting the software offset:
    a) just (off)sets the virtual cameras.

How both affect the pre-lens warp function is difficult to say (or guess). My estimate would be that the warp is tied to the lens position, so it only changes in 1), but I have not really tried to prove it.

Setting either 1) or 2) (i.e. the virtual cameras' “pupillary distance”) also defines what you call “image separation”, although it is not a parameter per se but rather a consequence of the pre-lens warp transformation. In other words, where on the display the image appears is pretty much defined by the set IPD and the optics.
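
To make that concrete, here is a minimal pinhole sketch (my own illustration, which ignores the pre-lens warp entirely): the on-screen disparity of a point falls out of the virtual cameras' distance and the point's depth, rather than being a parameter of its own.

```python
# Minimal pinhole sketch (illustrative only, no lens warp): the stereo
# "separation" on screen is a consequence of camera IPD and depth.

def screen_x(world_x, depth, eye_offset, focal=1.0):
    """Project a world point onto one eye's image plane.
    eye_offset is -ipd/2 for the left eye, +ipd/2 for the right."""
    return focal * (world_x - eye_offset) / depth

def disparity(depth, ipd, focal=1.0):
    """Horizontal disparity between the two eyes' projections."""
    left = screen_x(0.0, depth, -ipd / 2, focal)
    right = screen_x(0.0, depth, +ipd / 2, focal)
    return left - right  # works out to focal * ipd / depth

for d in (0.5, 2.0, 10.0):
    print(f"depth {d:5.1f} m -> disparity {disparity(d, ipd=0.064):.4f}")
# Distant points converge to zero disparity; nothing here is tunable
# independently of the camera distance and the projection.
```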

4 Likes

It would indeed be a good idea to have an additional slider that only influences the warping profile. I don't think it is only the lens movement; something is also done in software when using the wheel, and I think it is different from what happens when you use the software offset slider (more on this in my next message).
There are already too many ways to screw up the optics and color calibration, but since Pimax headsets arrive at the customer screwed up, this is unfortunately what is required. I have had four so far, and every Pimax had a different optical alignment problem. For one headset I had to apply a V-offset of 2.5 for one display (R -2.5 and L 0), as there was a physical misalignment of the hardware!
It would just be very helpful if there were some technical documentation from Pimax explaining what really happens in software when moving a slider, and which slider to use to solve which problem.

1 Like

I think what happens in software in point 1b of your post is different from point 2a.
The change in “side distortion” is very noticeable between IPD 60 and 63. The offset slider does change stereo perception, but not the warping at all.
Maybe your face shape and eye position determine to what degree it affects the image. It also depends on how close you are to the lenses. I use the Comfort Kit on all my Pimax headsets, and between the thin and thick face pad there is already a huge difference. When moving away from the lenses you lose a bit of vertical FOV and the side distortion lessens. However, at some point you get barrel distortion more towards the center of the view, so moving further away is not the perfect solution either.
It is hard to find the sweet spot for distortion, but at a lower hardware IPD it is much easier for me. I just don't know what else is influenced by the low hardware IPD and how to compensate!
As the world scale never feels right in the Pimax (compared to my Vive Pro and Index), I also have no visible reference to tell what is off and what is more off.

Since I feel it is very important for all Pimax users to understand this, I would like to explain a little better what I meant by:
“Does the software IPD e.g. only shift the rendered images or does it shift the position of the camera for rendering the image. The result would be slightly but significantly different.”

If you render the images with the render views (camera positions) 6 cm apart, “freeze” the images, and then shift them on the displays so that the image centers are now, e.g., 7 cm apart instead of 6 cm, you do not get the same result as when you set the camera positions 7 cm apart prior to rendering.
You can easily try this by closing one eye and looking into a (real-life) room with many objects at different distances along a direct line of view. The objects shift closer to or further from each other when you move your head even slightly. Closer objects might even occlude certain objects in one head position, and when you shift your head 2 cm you will see them again. The world moves in parallax.
An already rendered (frozen) image that is merely shifted on the displays does not change the parallax of the objects!

My assumption is that the wheel influences the camera render positions of the images and the offset only shifts the already rendered image on the display. If that is true, we should use the offset sliders only to correct hardware misalignment (alignment of the two displays in the x or z position), but not to correct the IPD away from what is applied at the wheel. The result would not be correct!
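
Here is a minimal sketch of that difference (a hypothetical pinhole projection with made-up numbers): moving the camera changes the relative screen positions of near and far objects (parallax), while shifting the finished image moves everything by the same constant.

```python
# Illustration (made-up numbers): camera shift vs. shifting a finished image.

def project(world_x, depth, cam_x, focal=1.0):
    # Pinhole projection onto the image plane of a camera located at cam_x.
    return focal * (world_x - cam_x) / depth

near, far = (0.1, 1.0), (0.1, 10.0)   # (world_x, depth) of two objects

for cam in (0.030, 0.035):            # camera at ipd/2 = 3.0 cm, then 3.5 cm
    nx, fx = project(*near, cam), project(*far, cam)
    print(f"camera {cam:.3f}: near {nx:+.4f} far {fx:+.4f} gap {nx - fx:+.4f}")

# Now take the cam = 0.030 render and slide the whole image by a constant.
shift = -0.005
nx, fx = project(*near, 0.030) + shift, project(*far, 0.030) + shift
print(f"shifted image: near {nx:+.4f} far {fx:+.4f} gap {nx - fx:+.4f}")
# The near/far gap (the parallax) changes only when the camera moves;
# shifting the already rendered image leaves it frozen.
```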

2 Likes

It does shift the virtual cameras. If you want to check it, you can use hmdq (Release v2.1.1: Added a workaround for invalid FOV data (Quest 2) · risa2000/hmdq · GitHub) and check the output for either the eye poses (the eye-to-head matrices) or the overall IPD. These values define the positions of the virtual cameras (which the game then uses for rendering) and change when you change either the software offset or the hardware setting.
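
If you prefer a script to hmdq, something like this should show the same values through the pyopenvr bindings (a sketch only; it assumes SteamVR is running with the headset connected, and uses the standard OpenVR calls):

```python
# Read the virtual camera positions that SteamVR reports to games.
# Requires: pip install openvr, SteamVR running, headset connected.
import openvr

system = openvr.init(openvr.VRApplication_Background)
try:
    left = system.getEyeToHeadTransform(openvr.Eye_Left)
    right = system.getEyeToHeadTransform(openvr.Eye_Right)
    # The x-translation of each eye-to-head matrix is that eye's virtual
    # camera position; their difference is the IPD used for rendering.
    ipd = right.m[0][3] - left.m[0][3]
    print(f"virtual camera IPD: {ipd * 1000:.2f} mm")
    # SteamVR also exposes it directly as a device property.
    reported = system.getFloatTrackedDeviceProperty(
        openvr.k_unTrackedDeviceIndex_Hmd, openvr.Prop_UserIpdMeters_Float)
    print(f"Prop_UserIpdMeters: {reported * 1000:.2f} mm")
finally:
    openvr.shutdown()
```

Change the wheel or the offset, re-run it, and you can see for yourself whether the virtual cameras moved.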

7 Likes

I found one of your posts explaining this:
https://community.openmr.ai/t/pitool-ipd-offset-how-does-it-work/22344

However, it might be worth having a look at the recent PiTool versions (266 or 1.0.2.87) to see whether the behavior changed when the individual H-offsets and 0.1 steps were introduced.

And somehow it does not work for me. According to your post, I should be able to dial in IPD 61 and then compensate back to 64 in software. However, the result at IPD 64 with offset 0 is different from IPD 61 with offset -0.3 or -3 (I don't know the scale of the offset number).

Even if it worked like that, it is not clear to me how an offset of -1.0 in the older PiTool (where both images were linked to one slider) translates to the new one: is it now R -0.5 and L -0.5, or both at -1, to achieve the same result?

Another issue arising from your post: if the offset and the wheel do the same thing with respect to shifting the camera positions, there would be no way to correct only a hardware misalignment of the displays without affecting other aspects of the optical path. It only makes sense if you assume that the calibration of the Pimax headset is in general correct and that you want to shift the lenses independently of the IPD while keeping the correct viewpoint. But we know Pimaxes are almost never correctly calibrated.

Whatever I do, the world scale in the 5K XR is the biggest issue for me. It is way larger compared to the Vive Pro, no matter what I do (wheel or offset). I really want this headset to work and give me the same immersion as the Pro or Index!

I have high hopes that you can help me with this. I am slowly getting distressed about not being able to get it right!

1 Like

It sure does… However, it is impossible to get both the IPD overlay and the PE screen to converge, which gives a subpar experience. If I adjust for optimum sharpness on Virtual Desktop, the distortions get worse; if I trim for least distortion, the sharpness is reduced. That is where the “offset” should kick in, but it doesn't, at least not as hoped.

After months of tinkering, I have a good real-life view dialed in with the 5K XR! Most of the time was spent trying to understand how the software layers work together and how the settings influence each other.

It gives a huge jump in immersion, although the lenses are still a big part of why the immersion is not perfect. I conclude that there is no way to fix every issue to perfection with the current hardware. You either end up with strong side distortion, visible even at Small FOV; or you fix that and get distortion towards the ceiling or ground level, or increasing barrel distortion closer to the center; or you end up with a view into VR that is completely off from the real world. You have to choose which compromise is best for you.

The distance from the lenses (thickness of the face pad) and the angle on your head are also a big part of the equation. Even how much you tighten your head strap matters. Every time you wear it, you need to put it on in exactly the same position, at the same angle, and with the same tightness; otherwise your settings are already off again.

I used the thin pad and added double-sided Velcro at certain positions in various layers. I bought thick and thin Velcro to fine-tune.

My IPD is 64, and I found the sweet spot at exactly 62.4 with H-offset -0.5 and V-offset -0.6. Changing the H-offset by 0.1, or dialing the wheel to the next number (62 or 62.8), already reduces the feeling of presence!

During these months I was sure that an H-offset of -0.9 or -0.7 could be right. I used the hardware IPD at 65 and at 61 and thought I was on the right track. However, when immersion really kicks in, all doubts are gone. I have been using the new settings for some days now and comparing them to the Vive Pro and Index. It is correct now! The remaining immersion-breaking issues cannot be changed with this hardware (light bending and distortion on the inner sides of the lenses, the outer sides, and the upper and lower sides). The center of the view, at the size of the Vive Pro's total view, is now as close to a real-life view as it can be.

I am also sure that you cannot compensate every IPD setting with a corresponding H-offset setting. It had to be 62.4 with a -0.5 offset. There was no way to get the same immersion at, e.g., IPD 61 with any other H-offset!

Now I have to do the same for the 8KX and hope that, with all the experience I have now, I get it dialed in more quickly.

For all others who read this: you might think your settings are right, but you will not know until you compare them to other headsets or get them dialed in by accident or luck. It makes a huge difference, and you know it once it happens. You might have gotten used to the Pimax the way you use it, but the level of immersion can most probably be much higher!
I hope this post will help someone with their Pimax!

Now a wireless adapter is required!

3 Likes

Is the H-offset now working the opposite way?

When I checked the impact of the IPD offset (a long time ago), positive values increased the virtual camera distance (the cameras' pupillary distance). Using a negative offset would mean making the virtual camera distance even smaller than the hardware IPD (which is already smaller than your real IPD). This goes against the intuition that, in order to see the scene geometrically correctly, one should set the virtual cameras so that they mimic one's own eyes.
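
A rough triangulation sketch (my own simplification; it ignores accommodation and the optics) of why a too-small virtual camera distance inflates the perceived world scale:

```python
# Back-of-the-envelope: a scene rendered with virtual camera distance b,
# viewed with real eye separation e, re-triangulates a point at depth d
# to roughly d * e / b (small-angle approximation).

def perceived_depth(true_depth, virtual_ipd, real_ipd):
    vergence = virtual_ipd / true_depth   # vergence angle in radians (approx.)
    return real_ipd / vergence            # depth the eyes triangulate

real = 0.064                              # assume a 64 mm real IPD
for virtual in (0.058, 0.064, 0.070):
    d = perceived_depth(2.0, virtual, real)
    print(f"virtual IPD {virtual * 1000:.0f} mm -> 2 m object appears at {d:.2f} m")
# 58 mm -> ~2.21 m (world looks bigger), 70 mm -> ~1.83 m (world looks smaller)
```

This would also fit the reports above that the world scale feels too big when the virtual camera distance ends up below the real IPD.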

Thanks, that's what I expected, and it is important to know. It would also be useful if we could shift just the virtual cameras in order to change the “world scale” (some games, like SkyrimVR, have that). For me, when I have the IPD offset applied (PiTool 260) so that the image is sharp, it makes the scale of everything look too big (my IPD is only 58). I have learned to get used to it, because it's far better than having a blurry image.

Good luck with that. My 5K+ was easy to set up, but I was unable to do the same with my 8KX because of its inherent design flaws.

It does not seem like it!
When I apply an offset to the right eye by increasing the negative value, the image shifts outwards (to the right side). When I increase the left eye's negative value, the left eye image also shifts outwards, to the left. This increases the stereo separation of the two images, I would conclude.
I am only talking about using PiTool directly, not the PE software. Maybe they took a more logical approach in PE.
And it clearly works.
Reducing the negative value as you suggested gives me eye strain. At a value of 0 it is already clearly off and cannot be fixed by any IPD applied with the wheel. Going positive from there breaks the 3D completely and results in double vision.

Yes, that would be logical, and it is how any other company would have implemented it. But not Pimax, of course.
See also my answer to risa2000 for details.
I am also only talking about PiTool 266 and some previous versions. I don't know whether it is inverted in PE or in future versions of PiTool.

OK, I was on 260 before. I just updated to 266, and it is a big change. I can't even use the IPD offset I had before; it behaves very differently. I was using -3.0 total with 260, and now it looks best at 0 or +0.1 per eye (horizontal offset).

The only proof is in the numbers :wink:. Could you run hmdq on your config and post the IPD reported by SteamVR? That way you can also check the impact of changing the offset yourself.

2 Likes

Hello, Emax

We suggest you log a ticket with our technicians; they may be able to offer some help. Please provide the ticket number you create, and we will urge them to respond to it.

Sincerely!

OK, I will give it a shot when I have some time (in a couple of days, hopefully)!

Others on Reddit are already reporting that the behavior of the offset slider, and even the IPD applied by the wheel, gives different results in PiTool 266. I hope, however, that they don't change it again, since I was finally able to dial it in with PiTool 266.

Hello PimaxOliver,

What should the ticket be targeting? This is just how your Pimax headsets work out of the box! I have had four different ones (8KX, 5K XR, Artisan, 8K Plus), and with none of them did the optical calibration result in a “real-world view” when I only dialed in my correct IPD and left the H-offset at its default! This always worked easily on a Vive, Index, Rift, or Quest, for example.

Pimax should maybe start compiling technical documentation, or provide tools, to help your customers understand how the underlying software works and what to do to get the optics right! So far I could only find hints from other users!

You cannot solve my issues. Also, none of my other issues have been solved by Pimax (tracking judder with all blue-housing headsets, my issues with the Artisan, etc.). I already solved them myself after months of tinkering and experimenting: I found the right optical settings for the XR; I bought LH 1.0 base stations that now work with the 8KX (many LH 2.0 stations do not, see other threads in this forum about it); and I sold my blue-housing Artisan, as it was unusable to me (probably an LH 2.0 issue as well, found out too late). My 8KX requires a 2.5 V-offset for the right display alone to get the two displays in line! That is how well calibrated the hardware is; it looks like every headset is unique.

By the way: every support team should be rated by how many problems it solves, not by how many tickets are opened. My feeling is that Pimax only wants to manage tickets, not really solve issues!