I would like to understand what happens in software when using the hardware IPD wheel vs. the software IPD slider in the newest PiTool versions (1.0.2.86 or 1.0.1.266 and above).
My current understanding and observations are:
The software IPD offset moves the two images further apart (increasing negative values, or is it decreasing positive ones?) or closer together. It has a strong effect on stereo separation and the perceived world scale.
Moreover, a virtual object and a real-world object at the same distance should ideally demand the same vergence and accommodation from the eyes. A negative H-offset value increases the depth perception.
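To make the scale effect concrete, here is a minimal sketch (my own illustration, not anything from PiTool) of the vergence angle the eyes must form to fuse an object at a given distance for a given effective IPD. If the software renders for a larger IPD than your real one, every object demands more vergence than it should, which the brain can read as a change in world scale.

```python
import math

def vergence_angle_deg(ipd_mm: float, distance_m: float) -> float:
    """Angle between the two eyes' lines of sight when fixating a point
    at the given distance, in degrees. Purely geometric illustration."""
    half_angle = math.atan((ipd_mm / 1000.0) / 2.0 / distance_m)
    return math.degrees(2.0 * half_angle)

# Example values: a point 2 m away, fused with two different effective IPDs.
print(vergence_angle_deg(64.0, 2.0))  # real IPD 64 mm
print(vergence_angle_deg(61.0, 2.0))  # smaller effective IPD -> less vergence
```

The absolute angles are tiny, but the mismatch between rendered and real geometry is what (as far as I understand) drives the "world too big / too small" impression.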
The hardware IPD wheel physically moves the lenses, and the image seems to jump whenever the IPD overlay appears. I assume the image follows the lenses: since the screens themselves are not physically moved, this must be implemented in software.
From this one could conclude that the software IPD and the hardware IPD apply the same manipulation to the image; the wheel just moves the lenses in addition to shifting the on-screen image.
But that is not what I observe. The wheel has no significant influence on stereo separation or world scale when moving, e.g., from 61 to 65 IPD. I remember this being different in earlier PiTool versions (though I am not sure).
However, it does affect the distortion profile/perception. Edge distortion is significantly reduced at IPD 61; from 62.5 onwards it is clearly visible and increases towards 65. My real IPD is 64.
So I would like to set the hardware wheel to IPD 61, as that eliminates the distortion, and then use an H-offset of around -0.7 to restore the correct vergence/accommodation described above (in sync with real life).
Unfortunately, values like -0.9 or -0.6 also look fine and in sync, yet they have a significant impact on how the VR world is perceived. I am stuck here, because no single value seems perfectly right; mostly the world scale seems too big.
After wearing the headset for a few minutes, every value seems right (-1.5 or -0.3) as long as it does not strain the eyes. The brain adapts quite fast to the settings, so I need to wait hours or a day before continuing my "optical calibration".
With the Vive Pro or Index, everything feels right in the first second after putting it on. With the Pimax XR it always feels a little odd in the first seconds to minutes, and then I try the next offset value!
And most importantly: when I set IPD 64 on the wheel and H-offset 0, it is completely off and definitely not correct.
Can anyone help me with this? I have been trying for months and have not played any games since becoming a Pimax user!
I have read everything I could find on the internet about this, but I do not yet have a final solution to my problem.
It would be nice if someone could explain in depth what the PiTool software really does to the image. Does the software IPD offset, for example, only shift the already-rendered images, or does it shift the position of the cameras used to render them? The results would be subtly but significantly different.
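I do not know PiTool's internals, but the difference between the two interpretations can be sketched with a toy pinhole projection (all names and values here are my own hypothetical example): moving the render camera changes each point's on-screen position in a depth-dependent way (near objects shift more than far ones), whereas translating the finished image moves every pixel by the same amount regardless of depth.

```python
# Toy pinhole-camera sketch (not PiTool's actual code).

def project_camera_shift(point, eye_offset, focal=1.0):
    """Move the camera itself before projecting:
    the resulting shift depends on the point's depth z."""
    x, y, z = point
    return (focal * (x - eye_offset) / z, focal * y / z)

def project_image_shift(point, pixel_shift, focal=1.0):
    """Project with an unshifted camera, then translate the image:
    every point moves by the same amount, independent of depth."""
    x, y, z = point
    return (focal * x / z + pixel_shift, focal * y / z)

near = (0.0, 0.0, 0.5)   # point 0.5 m away
far = (0.0, 0.0, 10.0)   # point 10 m away
off = 0.01               # 1 cm offset, arbitrary example value

# Camera shift: the near point moves 20x as far as the far point.
print(project_camera_shift(near, off)[0], project_camera_shift(far, off)[0])
# Image shift: both points move identically.
print(project_image_shift(near, off)[0], project_image_shift(far, off)[0])
```

If PiTool only shifted the image, depth cues from parallax would stay unchanged and only convergence would move, which might explain why the two controls feel so different.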