Your real IPD with a Pimax HMD, after a lot of work. The future of eye tracking

I don’t want to make this too long, so I’m going to explain it quickly. And excuse my English.

As a developer I have worked a lot on the subject of lenses across VR headsets, and I have been very interested in varifocal technology (which is related to what I am going to explain).

Contrary to what many believe, people do not have a single IPD; we have a range. It generally runs from a NEAR IPD (reading a book) to a FAR IPD (looking at a landscape), with a continuum of values in between, such as a “computer” IPD (closer to near than to far). Optometrists usually report only the extremes of the range, not the intermediate values.

IPD is directly related to the real sense of depth and presence. For example, if we use the NEAR IPD in VR, close objects will have the depth they have in the real world, but distant objects will look flat and closer than they should be.

If we use the FAR IPD (as is usually done), nearby objects look flatter than they should and somewhat too distant, but backgrounds look three-dimensional and correctly placed.

Some professionals have started using a “VR IPD”, measured for a fixation point about 3 meters in front of the user, since roughly 90% of VR games are room-scale experiences that make you fixate on targets around that distance.

For example, my IPDs are:
FAR IPD: 68.5mm (infinity)
NEAR IPD: 66.2mm (30cm)
VR IPD: 67.5mm (2-3 meters)
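
To put numbers on this range, here is a minimal sketch of the vergence geometry (my own illustration, assuming the pupil sits roughly 12 mm in front of the eye’s centre of rotation; real anatomy varies per person, so it will not reproduce anyone’s measured values exactly):

```python
import math

def converged_ipd(far_ipd_mm, fixation_dist_mm, eye_radius_mm=12.0):
    """Approximate pupil separation when both eyes converge on a point
    straight ahead at fixation_dist_mm.

    far_ipd_mm: pupil separation for parallel (infinity) gaze.
    eye_radius_mm: assumed distance from the eye's rotation centre
    to the pupil (a rough anatomical average).
    """
    # Each eye rotates inward by the vergence half-angle...
    half_angle = math.atan2(far_ipd_mm / 2.0, fixation_dist_mm)
    # ...which swings each pupil toward the nose by r * sin(angle).
    return far_ipd_mm - 2.0 * eye_radius_mm * math.sin(half_angle)

print(converged_ipd(68.5, 300))     # reading at 30 cm   -> ~65.8 mm
print(converged_ipd(68.5, 2500))    # "VR" range ~2.5 m  -> ~68.2 mm
```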

Once your IPD is really well configured, 3D depth feels just like the real world, but only at that distance (far/near). That is why I use different IPDs depending on the game: 68.5mm for open-world games, 66.2mm for games where I manipulate objects all the time, and 67.5mm for generic games.

In the real world our eyes move through this range of IPDs as part of perceiving depth; in VR the rendering IPD stays fixed, so this does not happen.

However, an eye-tracking system could easily make a software adjustment to the image and achieve something similar to varifocal technology (only without the distance blur). The depth effect of eye-tracked headsets with IPD depth correction would be on another level compared to current technology.

In reality, some 80% of VR gamers are playing their games semi-flat. Until now it did not matter much, because the screen-door effect (SDE) hid the problem, but the arrival of HMDs with no SDE and high pixel density is exposing this deficiency.

Until eye tracking solves this, I advise you to measure your FAR/NEAR IPD and switch between them depending on the game. Once you get the setup right for the first time, you won’t believe you had been playing semi-flat VR before.

@PimaxVR @PimaxUSA, I hope you can add an option to PiTool that uses eye tracking to adjust the IPD between the far and near values. I think it is a bit difficult, but depth information can be extracted by comparing pixels between the two eyes’ images.
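
For anyone curious what such a PiTool option might do under the hood, here is a rough sketch: estimate the fixation distance from the convergence of the two tracked gaze directions, then blend between the user’s measured near and far IPD. All names and the gaze-angle convention here are hypothetical, not any real Pimax or eye-tracker API:

```python
import math

def fixation_distance_mm(left_yaw, right_yaw, far_ipd_mm):
    """Estimate fixation distance from horizontal gaze angles as an
    eye tracker might report them (radians, positive = toward the
    nose). Hypothetical convention; real SDKs differ."""
    vergence = left_yaw + right_yaw          # total convergence angle
    if vergence <= 1e-4:                     # (near-)parallel gaze
        return float("inf")
    return (far_ipd_mm / 2.0) / math.tan(vergence / 2.0)

def dynamic_ipd(dist_mm, near_ipd=66.2, far_ipd=68.5,
                near_dist=300.0, far_dist=10_000.0):
    """Blend between measured near and far IPD. The blend is linear in
    1/distance, because vergence angle scales roughly with 1/distance."""
    t = (1.0 / max(dist_mm, near_dist) - 1.0 / far_dist) \
        / (1.0 / near_dist - 1.0 / far_dist)
    return far_ipd + min(max(t, 0.0), 1.0) * (near_ipd - far_ipd)

print(dynamic_ipd(fixation_distance_mm(0.057, 0.057, 68.5)))
# fixation ~0.6 m -> ~67.4 mm
```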

22 Likes

This is an interesting post and I believe it is useful for increasing VR users’ understanding of IPD. The concept of changing IPD to suit the game one is currently playing had not occurred to me.

When you write that VR gamers play their games semi-flat: can you explain how SDE (and perhaps resolution) hid this feeling, and why higher pixel density and less SDE now expose the deficiency? This is anecdotal, but I got into VR with the Oculus DK2. My memory of using the DK2 is that I had a stronger sense of presence, and that the 3D/depth felt more pronounced than with newer headsets (HP Reverb Pro, Pimax, and Valve Index). I had assumed this had more to do with adaptation to the technology, or perhaps the nature of canted displays, rather than pixel density and the lack of SDE.

It does make sense, though. If everything in the distance was blurry, as it was in the DK2, you would probably use the near IPD, as I did, because it was mostly on objects close to you that you would notice the difference.

I’m excited to see where the discussion in this post leads!

2 Likes

Yes!! I know what you are talking about, hahah. When I had the DK2 I had the same feeling.

Now, with the Pimax and the Oculus Quest 2, I had to “tweak” it, but I ended up with real 3D depth. Sometimes I need to adjust it carefully and accurately, but the difference even a 1% change makes is absolutely incredible.

I think that the better the screens get, the more precisely we need to adjust IPD to get real immersion.

1 Like

Idk what is wrong with my IPD settings. I have been playing since 2016; when I first tried PSVR in a shop I was blown away, then I ordered a CV1 and was blown away again by that robo demo. Years later I still get 3D depth to some degree, but I would indeed call it “semi-flat”; the brain adapts. I dialed the Quest IPD to maximum and got better 3D but worse clarity; it worked partially, then I got used to it again. Usually at close distances you see 3D, but when looking into the distance it gets semi-flat no matter what IPD I set. With some settings it gets better, but it is never full 3D, it’s kinda semi-flat. That said, the Index and the Vive/Vive Pro, and to a slightly lesser degree the Pimax 5K+, give me much better 3D depth than Oculus products. I also have other HMDs, like the O+ and the Rift S, and they look similar to the others in terms of 3D depth.

But the first VR experience is amazing. I remember that $hitty tank game on PSVR: you look behind you and wow, the chair, the cockpit, everything is so real, you can feel the depth so deeply. Eventually it wears off to some degree. I even experimented with a real object versus HMD settings, trying to get similar left/right-eye image differences while looking at a similar box object at the same distance and angle in VR. Things get better when you dial in the IPD and HMD placement properly, but never to the same level as the first time you tried it. I think when I take longer breaks from VR I feel the 3D better, but it’s just a hypothesis. When I focus on far objects, closer ones go out of focus, and vice versa; the image alignment is fine, but there are no different light-source distances. You also seem to be able to train this stereo-vision effect by looking at close and far objects and trying to imagine a big space in your imagination, and that can also make things better. But usually it’s just a semi-flat experience for me nowadays.

3 Likes

I know what you are talking about too; for me it is the same. Even now that I have adjusted my Quest 2 with 3D-printed pieces and my Pimax with software/hardware IPD… I can only get 3D depth at certain moments, or at certain distances.

The problem starts once you have had this sensation. When I didn’t know this level of immersion, I just played; but now my brain is always searching for it, especially because my eyes feel more comfortable in those moments, and I want that sensation all the time.

I think this is the last wall in VR. We no longer have SDE, and we have better FOV, higher refresh rates, better tracking… but the remaining limit is the real sensation of 3D depth, which is only possible with:
- Automatic IPD adjustment based on fixation distance.
- Varifocal lenses for natural distance blur.

For me, automatic IPD is more important than distance blur. I hope eye tracking can solve it soon…

1 Like

Yes, things are developing rapidly. I got a Vive Pro Eye and it works quite well, apart from having no support in most applications. I also waited for the new Oculus HMD announcement and hoped it would finally get varifocal lenses, since they already had working prototypes; it’s the feature I’m most waiting for, together with a good software implementation of eye tracking and performance optimizations like VRS, which are key elements for varifocal lenses. Vive, now Pimax, and also HP with the Omnicept have eye tracking; they still have legal frameworks and other things to finish, but once that is done it will be much easier to address the rest, because without gaze data none of this is possible.

3 Likes

Could you explain what exactly you mean by “IPD”? It seems you are conflating different topics:

Why is there a “near IPD” in the first place? Because of the optics. When people read, their eyes typically focus on something directly in front of them and relatively close, which means the pupils, converging on the object being read, move closer together. This is why opticians place the lenses in prescription reading glasses closer together: a lens typically has its best optical properties along its optical axis. This, and the fact that the object being read does not move, makes the “optimization” justified.

Once you start using glasses to look at things all over the FOV (typical for shortsighted people), there is no better spot to center the lens on than the eye’s axis: if you move the lens toward the nose, the optical degradation when looking to the side will be worse than if you had simply centered the lens on the eye.

So there are two different use cases: reading versus everything else. It makes sense to optimize the lens separation in glasses for reading; it does not make sense to do it for everything else. In other words, moving the lenses closer together only makes sense if the object is close and right in front of the observer. If the object is close but off to one side of the view, you lose the advantage of having the lenses closer.

A VR headset is not a pair of reading glasses, even if it is “focused” at a constant distance. You still look all around and need the best lens performance everywhere.
Note: I am not talking about the Pimax headset in particular here, because it has a few problems of its own in this department, but about any “generic” headset with parallel lenses and views.

Now, there is another aspect of IPD (in a VR system), completely unrelated to lens performance: it defines the distance between the virtual eye cameras used to render the scene. This definitely has an impact on depth perception, because we are each trained to our own IPD and the brain reconstructs 3D perception from it.
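
(To make that concrete with a minimal sketch, not any particular runtime’s API: the two render cameras are simply the head pose displaced half the IPD to each side.)

```python
import numpy as np

def eye_positions(head_pos_m, head_right_dir, ipd_mm):
    """The two virtual camera positions: the head centre displaced
    half the IPD along the head's right axis, one eye each way."""
    half_m = (ipd_mm / 1000.0) / 2.0
    return (head_pos_m - half_m * head_right_dir,
            head_pos_m + half_m * head_right_dir)

# Example: head 1.7 m up at the origin, x pointing right.
left, right = eye_positions(np.array([0.0, 1.7, 0.0]),
                            np.array([1.0, 0.0, 0.0]), 68.5)
```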

But for this aspect to work correctly, it is more important that the perceived image is rendered correctly across the complete FOV, not just for one particular eye position (and orientation), which is what you seem to be suggesting (though I may simply be misunderstanding your post).

There are some specific problems in VR caused by pupil swim (and potentially differing eye relief), but I do not think those are the problems you are trying to solve, and they are rather subtle compared to what you propose.

Do you have a reference to share about the professionals who use a “VR IPD”?
I do not think that just because 3 meters is the most common distance for objects in VR, we should use a different (smaller) IPD for it. The reason is that objects can be anywhere in the FOV, not strictly at its center.

4 Likes

I believe this is the main thrust of what they were getting at, despite all the mention of lenses, which are somewhat more tangentially involved (I think they intimated as much, but I am not sure): the slight parallax shift you get from varying degrees of convergence (which is of course an analogue property, in case anybody got snowed in on the labelling of discrete levels). I recall watching some SIGGRAPH presentation about the difference in projection verity (EDIT: …when updating the camera position every frame, to match the user’s gaze) being subtle, but not at all negligible.

(EDIT: Let me offer an easily tried example, if any reader wishes more clarification: if I close one eye and hold up a finger right in front of the remaining open one, the finger blocks my view straight ahead. However, if I swivel my eyeball to look out to either side of the finger, the pupil of my eye will have moved far enough to the side that I can now see some of what the finger previously occluded, around its edge. This does not happen in VR, because the camera remains right behind the finger at all times, regardless of where I am looking.)
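
(To put a rough number on that effect, a quick sketch, assuming the pupil sits about 12 mm from the eye’s rotation centre, an assumed average:)

```python
import math

EYE_RADIUS_MM = 12.0   # assumed rotation-centre-to-pupil distance

# How far the pupil translates sideways when the eye swivels; this is
# the parallax a fixed per-eye camera cannot reproduce.
for gaze_deg in (10, 20, 30, 45):
    shift = EYE_RADIUS_MM * math.sin(math.radians(gaze_deg))
    print(f"{gaze_deg:>2} deg gaze -> pupil moves ~{shift:.1f} mm sideways")
# At 30-45 deg the viewpoint has moved 6-8 mm: enough to peek around a
# finger held close to the eye, as in the example above.
```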

Doc Ok made a video some time ago that first demonstrates the effect of a fixed camera projection when looking around, and then, at the suggestion of a commenter, how decent a universal approximation it is, distortion-wise, to make the projection from the centre of the eyeball (…which I guess everybody is doing) instead of from the centre of the eye’s lens.

/me never heard of “VR IPD”, either. :7

3 Likes

Of course, here you go.

And well, as for professionals… I’m actually an official Epic instructor, and I work together with Unreal programmers in the VR department. We always work with the VR IPD, not far/near.

There are also three iPhone apps that give you the VR IPD, not only far and near, using the front camera scanner.

3 Likes

And here is a screen capture of one of the apps we use. This one gives the VR IPD for open-world games.

3 Likes

We are also discussing this on Reddit, in the virtualreality and oculusquest subs.

2 Likes

Oh! If you want to know why I brought this up: it was because I actually had a conversation with John Carmack about getting access via adb to the IPD software on the Quest 2, so we could change it slightly inside Unreal Engine when the user is looking at a near object. It could be an intermediate solution we want to apply in Unreal Engine.

But Carmack told me that the Oculus SDK currently doesn’t give apps control over it, so an accurate VR IPD inside the app isn’t possible; it only uses the far IPD.

I hope they can add this option in the future. For now we are looking for a way to simulate it inside the engine.

We want to talk to Pimax about it too, but I’m the only engineer using an 8KX at the moment. Still, I want to try it.

4 Likes

Did you mean something like this: Computational Imaging?
This is what I had in mind in my reply, but it can really only be addressed by dynamically adjusting the eye pose at runtime, not by an arbitrary (but fixed) change to the IPD, which is just a scalar and cannot even describe the full position of the eye. Besides, it may well put the eye on the “other side” of the center, depending on where one is looking.
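
A small sketch of that last point (my own illustration; the 12 mm rotation radius is an assumed average): when both eyes look 30° to one side, both pupils translate that way, so the eye positions change even though the scalar separation does not:

```python
import math

EYE_RADIUS_MM = 12.0   # assumed rotation-centre-to-pupil distance

def pupil_position(rot_centre_x_mm, gaze_yaw_rad):
    """Pupil position (x: right, z: forward) for an eye whose rotation
    centre sits at rot_centre_x_mm on the head's x-axis."""
    return (rot_centre_x_mm + EYE_RADIUS_MM * math.sin(gaze_yaw_rad),
            EYE_RADIUS_MM * math.cos(gaze_yaw_rad))

# Both eyes looking 30 deg to the left: each pupil shifts ~6 mm left,
# so the pair translates sideways while the separation stays 68.5 mm.
# No single IPD scalar can describe that lateral shift.
left  = pupil_position(-34.25, math.radians(-30))
right = pupil_position(+34.25, math.radians(-30))
print(left, right, right[0] - left[0])
```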

Interesting. I guess I should be able to find it somewhere on YouTube?

1 Like

What exactly did you want to change? You can render the scene with whatever eye pose you choose; you do not need to use the one supplied by the Oculus API.

1 Like

When you use the Oculus SDK, the image is fixed to internal code specifications. That is the problem: it changes automatically depending on which Oculus hardware the app runs on, and it is changed dynamically, outside the app’s code, when the user changes the IPD on the hardware.

1 Like

Looks like it.

Exactly. Eye tracking is needed, as would be, preferably, API-standardised eye pose updates for every frame.

The developer is of course free to move the cameras around to suit any whims on their end, unrestricted by the hosting VR runtime, but just adjusting the fixed IPD for a scenario, would be to make the assumption that the user will only ever look at an object that is right in front of them, and at a certain distance. Were one to really double down on the former, one could update the latter with a raycast right down the centre of the view, but it would be unfortunate to do that doubling down in the first place. :7
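
For completeness, a sketch of that raycast fallback (engine-agnostic; scene_raycast and dynamic_ipd are hypothetical stand-ins, the latter being a near/far blend like the one sketched earlier in the thread):

```python
def ipd_from_centre_raycast(scene_raycast, head_pos, head_forward,
                            dynamic_ipd):
    """Without eye tracking: assume the user fixates whatever sits in
    the centre of the view, raycast there, and feed the hit distance
    into a near/far IPD blend. scene_raycast stands in for whatever
    scene query the engine actually offers."""
    hit_dist_mm = scene_raycast(head_pos, head_forward)  # None if no hit
    if hit_dist_mm is None:
        hit_dist_mm = float("inf")      # open sky: fall back to far IPD
    return dynamic_ipd(hit_dist_mm)

# Toy usage: everything in the scene is 2 m away, trivial blend.
print(ipd_from_centre_raycast(
    lambda pos, fwd: 2000.0,
    head_pos=None, head_forward=None,
    dynamic_ipd=lambda d: 68.5 if d > 1000 else 66.2))
```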

Umm… Should be on his channel, albeit old and outdated, and not showing anything you don’t already know. :7

EDIT: This seems to be it: https://www.youtube.com/watch?v=SC9vbV-zjBk

1 Like

Hi Yen - is this an iOS app? Could you share its name, or the name of another app providing a suggested VR IPD? I’m interested in experimenting with it.

You may want to watch the videos from DocOk first:

to understand the problem, before trying to apply the solution.

5 Likes

Well, guys… fancy words aside, and in short: what should I adjust first, second, third, etc., please?
Should I adjust the hardware IPD first? If yes, to what value, given that my far IPD is 63mm?
Should I adjust the software IPD first?
Should I go with my near IPD (58mm)?
Can any of you give me some sort of guide, PLEASE?
Because all that stuff is so CONFUSING!
Thank you!

2 Likes

Eh, it seems like too much work and not worth the effort. We’re better off waiting for that new technology Google showed off. It’s an alternative to stereoscopic 3D and it’s very convincing. I forget what it’s called.