I need to use a different IPD in each game. Have you tried it?

I've noticed something with my Pimax 8K: more depth compared to my other headset.

It's like the aperture in photography. With a high aperture number everything is in focus, but with a low one… you get a lot of depth.

So… in some games I use IPD 62, in others 65, and in Budget Cuts I use 70.

In Budget Cuts, with 70, the space is absolutely realistic. I close my eyes, think of my real room, open them again in the game, and the distances measure out correctly. But… because of this big IPD, I can't read text up close.

Now I understand why Oculus, with Half Dome and its 140° FOV, needs varifocal technology.

If you stare at something near, your eyes converge (move closer together). And if you look at something very far, the effective IPD is the biggest you can reach.


@anon74848233 and @PimaxVR team… I think you could use eye tracking to do the same.

With the wheel you adjust for the normal IPD, which is measured while you stare at an object about 40 cm (16 inches) away. Then, when you stare at something very close in virtual reality, PiTool could shift the IPD via software by some amount (like -5), and when you stare at something very far, your eyes are nearly parallel and the software could use a bigger IPD.

I don't know if PiTool can take depth information from the games to do that… but I think it would be a very big improvement, achieving the same thing Oculus wants.
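
To illustrate the idea, here is a minimal Python sketch, assuming the distance of whatever the user is fixating on is somehow available (from the game's depth or from eye tracking). The offset limits and the blending curve are made-up values for illustration, not anything PiTool actually exposes.

```python
import math

# Hypothetical sketch: map the distance of the fixated object to a software
# IPD offset. The wheel IPD is assumed to be calibrated while fixating at
# roughly 40 cm, as described above; the offset limits are made-up numbers.
BASELINE_FIXATION_M = 0.40    # distance the wheel IPD is calibrated for
MAX_NEAR_OFFSET_MM = -5.0     # example shift for very close objects (assumption)
MAX_FAR_OFFSET_MM = 2.0       # example shift when the eyes go parallel (assumption)

def vergence_angle_deg(ipd_mm, distance_m):
    """Angle between the two eyes' lines of sight when fixating at distance_m."""
    half_ipd_m = (ipd_mm / 1000.0) / 2.0
    return math.degrees(2.0 * math.atan2(half_ipd_m, distance_m))

def software_ipd_offset_mm(fixation_distance_m, baseline_ipd_mm=65.0):
    """Blend between the near and far offsets based on how converged the eyes are."""
    angle_now = vergence_angle_deg(baseline_ipd_mm, max(fixation_distance_m, 0.05))
    angle_ref = vergence_angle_deg(baseline_ipd_mm, BASELINE_FIXATION_M)
    # t is 0 at the calibration distance, +1 when the eyes are fully parallel,
    # and negative when the object is closer than the calibration distance.
    t = 1.0 - angle_now / angle_ref
    if t >= 0.0:
        return MAX_FAR_OFFSET_MM * min(t, 1.0)
    return MAX_NEAR_OFFSET_MM * min(-t, 1.0)

# Reading text at 25 cm vs. looking at something very far away:
print(software_ipd_offset_mm(0.25))    # roughly -3 mm
print(software_ipd_offset_mm(100.0))   # approaches +2 mm
```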

Thanks for your time :blush:


When I look at an object close to my eyes, I always see the black edge of the lenses and the middle.

Sorry @anon74848233 and @PimaxVR team, it's easy: you don't need the depth information, because the eyes can give it to the software.

You can create software that takes a measurement while the user stares at an object at a known distance. And then… if the eyes move, say, 10 mm closer together because the user is trying to read a letter, you can adjust the IPD via software :slight_smile:
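
A rough sketch of that, assuming the eye tracker reports one gaze direction per eye in a common headset frame (a hypothetical input format): the vergence angle between the two gaze directions gives the fixation distance, with no depth information from the game needed.

```python
import math
from typing import Tuple

Vec3 = Tuple[float, float, float]

def _angle_between(a: Vec3, b: Vec3) -> float:
    """Angle in radians between two direction vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    cos = max(-1.0, min(1.0, dot / (na * nb)))
    return math.acos(cos)

def fixation_distance_m(gaze_left: Vec3, gaze_right: Vec3, ipd_mm: float) -> float:
    """Estimate how far away the user is looking from how much the eyes converge.

    With vergence angle theta between the two gaze directions,
    distance ~= (IPD / 2) / tan(theta / 2). Near-parallel gaze means "far".
    """
    theta = _angle_between(gaze_left, gaze_right)
    if theta < math.radians(0.1):          # essentially parallel -> treat as far
        return float("inf")
    return (ipd_mm / 1000.0) / 2.0 / math.tan(theta / 2.0)

# Example: each eye toed in by ~4 degrees (8 degrees of vergence) at IPD 65 mm
left = (math.sin(math.radians(4)), 0.0, -math.cos(math.radians(4)))
right = (-math.sin(math.radians(4)), 0.0, -math.cos(math.radians(4)))
print(round(fixation_distance_m(left, right, 65.0), 2))   # ~0.46 m
```

That estimated distance could then drive the same kind of software offset as in the earlier sketch.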

I'm just thinking out loud… it's possible you already know all of this and are only planning eye tracking for its main function and for correcting distortion, haha. Sorry.


@Sean.Huang
@Alan.sun
@Doman.Chen
This requirement needs to be noticed.


When I use a thicker face pad, I can see more distortion at the center. I think eye-to-lens distance is still a factor that makes people look through the lenses at different angles.

If you could just compensate IPD in software, then why bother with mechanically movable lenses in the first place?
Also, the people at Valve and Oculus would know that, and they haven't compensated in software in all these years. There might be a reason?


Technically, you should be able to adjust the rendering position of the image on the panel, but without physically moving the lenses, is it really worth it?
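
For reference, "adjusting the rendering position" in software would essentially mean changing the separation of the two virtual cameras (and shifting each eye's image on the panel accordingly). A toy sketch of where such a value would plug in; the function and conventions are generic, not any particular SDK's API.

```python
import numpy as np

# Illustrative only: a real runtime/compositor owns these transforms; this just
# shows where a software IPD value would enter the stereo camera setup.

def eye_camera_positions(software_ipd_mm, head_pos=(0.0, 0.0, 0.0)):
    """Place the left/right virtual cameras half an IPD to each side of the head center (meters)."""
    half = (software_ipd_mm / 1000.0) / 2.0
    head = np.asarray(head_pos, dtype=float)
    return head + np.array([-half, 0.0, 0.0]), head + np.array([half, 0.0, 0.0])

left_cam, right_cam = eye_camera_positions(65.0)
print(left_cam, right_cam)   # [-0.0325 0. 0.] [0.0325 0. 0.]
```

Whether that helps much while the physical lens centers stay fixed is exactly what this and the next reply question.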


It's not just shifting the picture on the display. If the eye is not at the center of the lens, you would need a different lens-correction profile, because if you look "sideways" through the lens it is thicker on one side and thinner on the other, so the distortion will be different. That is similar to why people say you would need eye tracking to (dynamically) correct the outer area of a wide-FOV headset.
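
To make the "different correction profile" idea concrete, here is a toy sketch: radial distortion coefficients stored for a few eye offsets from the lens center and interpolated at runtime. The coefficients and the simple k1/k2 polynomial are invented for illustration; real headset profiles are vendor-specific and far more complex.

```python
import numpy as np

# Toy example: radial distortion coefficients (k1, k2) measured (hypothetically)
# at a few horizontal eye offsets from the lens center, in millimeters.
PROFILE_OFFSETS_MM = np.array([-4.0, 0.0, 4.0])
PROFILE_K = np.array([
    [0.22, 0.10],   # eye 4 mm toward the nose (made-up numbers)
    [0.20, 0.08],   # eye centered on the lens
    [0.23, 0.11],   # eye 4 mm toward the temple
])

def distortion_coeffs(eye_offset_mm):
    """Linearly interpolate k1, k2 for the current eye offset from the lens center."""
    k1 = np.interp(eye_offset_mm, PROFILE_OFFSETS_MM, PROFILE_K[:, 0])
    k2 = np.interp(eye_offset_mm, PROFILE_OFFSETS_MM, PROFILE_K[:, 1])
    return k1, k2

def predistort(uv, eye_offset_mm):
    """Apply a simple radial model u' = u * (1 + k1*r^2 + k2*r^4) around the lens center."""
    k1, k2 = distortion_coeffs(eye_offset_mm)
    r2 = np.sum(uv ** 2, axis=-1, keepdims=True)
    return uv * (1.0 + k1 * r2 + k2 * r2 ** 2)

# A point halfway out toward the edge, with the eye shifted 2 mm off-center.
print(predistort(np.array([[0.5, 0.0]]), 2.0))
```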