Yes. What is even more unfortunate is that if someone tries to apply the old “best practice”, they will most likely make things worse.
The major consequence is that the user can no longer compensate for the “wrong” lens position (in cases where the wrong lens position feels more comfortable) with the software IPD offset. Now (as I wrote above), the IPD for the virtual cameras is determined by (and only by) the hardware setting. So if you do not set your real IPD, the scene will be rendered with the wrong virtual camera positions.
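To illustrate the consequence, here is a minimal sketch (hypothetical, not Pimax's actual code) of what "the virtual camera IPD is determined only by the hardware setting" means: each eye camera is simply placed half the hardware IPD from the head center, so any mismatch between the hardware setting and the user's real IPD becomes a per-eye camera position error that nothing downstream corrects.

```python
# Hypothetical sketch of virtual camera placement driven solely by the
# hardware IPD setting (function and variable names are my own, not Pimax's).

def eye_camera_offsets(hardware_ipd_mm: float) -> tuple[float, float]:
    """Return the lateral (x) offsets of the left/right virtual cameras in mm."""
    half = hardware_ipd_mm / 2.0
    return (-half, +half)  # left eye, right eye

# Hardware set to 63 mm -> cameras at -31.5 / +31.5 mm.
# A user whose real IPD is 66 mm thus gets a 1.5 mm error per eye,
# and there is no software offset left to correct it.
left, right = eye_camera_offsets(63.0)
print(left, right)
```

The point of the sketch: in the old scheme the software IPD offset could absorb this error; in the new one it cannot.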
Good question. I have no idea. As it shifts the image across the display, it could be anything Pimax devised as convenient. The way to figure it out would be to capture the final (post-warp) image from the Pimax pipeline and check the image shift. But this will only work if the shift is done in the compositor and not on the headset itself. I guess I am not going to spend my time on that, for obvious reasons.
In general there are two fundamental problems in the new implementation:
- It removes the possibility for the user to set up the geometry right (in cases where the “correct” IPD setting is not comfortable and they need to use different lens positions). In other words, whenever you set the hardware IPD to anything other than your real IPD, you are screwing up the rendering geometry for yourself.
- It shifts the image across the screen but does not change the warping, which means that along with a visible shift it will also introduce visible distortion, because the (pre-lens) warping will no longer match the image position on the screen.
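The second problem can be shown with a toy model (my own construction, assuming a simple radial barrel warp; the real Pimax warp is certainly more complex): the pre-lens warp is computed around the point on the panel under the lens center, so if the image is shifted on the panel while the warp center stays put, every pixel gets warped as if it sat at the wrong radius.

```python
# Toy radial barrel warp (hypothetical, for illustration only).

def warp(x: float, y: float, cx: float, cy: float, k: float = 0.2) -> tuple[float, float]:
    """Scale each point's offset from the warp center (cx, cy) by 1 + k*r^2."""
    dx, dy = x - cx, y - cy
    s = 1.0 + k * (dx * dx + dy * dy)
    return (cx + dx * s, cy + dy * s)

shift = 0.1       # image shifted across the panel
x, y = 0.5, 0.0   # some sample point in the original image

# Consistent: the warp center moves together with the image.
matched = warp(x + shift, y, 0.0 + shift, 0.0)
# Inconsistent (the new Pimax behavior, as I understand it):
# the image moves but the warp center does not.
mismatched = warp(x + shift, y, 0.0, 0.0)

print(matched, mismatched)  # the two results differ -> visible distortion
```

The gap between the two results grows with the shift and with distance from the center, which matches the expectation that a larger visible shift brings more visible distortion.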