8KX High Res/FOV is Moot if the 3D Depth is Lacking

That seems like a really dishonest thing to do: show the product to everyone in the forum, say it will have these new lenses, and then swap in inferior ones.

I can only comment on the 5K+, and its 3D effect and world scale are pretty good.

In the 3D Vision days the 3D effect was mostly controlled by the separation and convergence parameters. Separation you can influence by changing the IPD; convergence you probably can’t change unless the application supports it. Scale can also be changed by the application, so a wrong scale is not necessarily Pimax’s fault (but the perception of scale also depends on the separation/convergence settings -> for example, if you have the eyes of a giant, then everything looks smaller).
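For intuition, here is a minimal toy model (my own sketch, not Nvidia or Pimax code) of how separation and convergence map to on-screen parallax in a 3D-Vision-style renderer: objects at the convergence distance get zero parallax, nearer objects pop out of the screen, and farther ones recede toward a maximum parallax equal to the separation.

```python
def screen_parallax(depth, separation, convergence):
    """Toy model of 3D-Vision-style stereo.

    separation:  inter-camera distance (arbitrary units)
    convergence: depth at which the left/right images coincide
    Returns signed horizontal parallax: 0 at the convergence plane,
    negative (pop-out) for nearer objects, approaching +separation
    as depth goes to infinity.
    """
    return separation * (1.0 - convergence / depth)

# An object exactly at the convergence plane has zero parallax...
assert screen_parallax(5.0, separation=0.6, convergence=5.0) == 0.0
# ...nearer objects get negative parallax (pop-out), distant ones
# approach +separation (maximum recession).
print(screen_parallax(2.5, 0.6, 5.0))   # negative: in front of the screen plane
print(screen_parallax(1e9, 0.6, 5.0))   # ~0.6: effectively at infinity
```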

1 Like

There were no non-Fresnel lenses. These would have been iterations.

I get better 3D depth by lowering the mechanical IPD to the minimum (mine is 64; this way the stereo overlap is higher) and then correcting it with the software IPD. The picture is not as crisp, but if the stereo overlap is low, it feels like a flat monitor in front of your eyes.

4 Likes

Are you sure? @Pimel seems to think the M2 had non-Fresnel lenses.

There are two things about the lens design on Pimax, related to the IPD, which make it a very delicate matter.

  1. The canted panels and the lens placement: the lenses are placed on the optical axis of the view (perpendicular to the panel), which means the lenses are canted as well.

  2. Because of this design, the lens centers are offset outwards (https://community.openmr.ai/t/clarifying-near-ipd-x-distant-ipd-confusion/14809).

This has generally confused people into thinking that their IPD is not set correctly, because when they measured the distance between the lens centers it was much bigger than their IPD.
This is, however, the expected behavior implied by the design.
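As a rough back-of-the-envelope illustration of why the lens centers sit wider than the IPD (the cant angle and eye-relief figures below are my own illustrative assumptions, not Pimax specs): each lens center lies on the outward-canted optical axis, so at the lens plane it is pushed outward by roughly eye_relief · sin(cant) per eye.

```python
import math

def lens_center_distance(ipd_mm, cant_deg, eye_relief_mm):
    """Approximate distance between lens centers for canted optics.

    Each optical axis pivots at the eye and is canted outward by
    cant_deg, so at the lens plane (eye_relief_mm away along the
    axis) the lens center is shifted outward by eye_relief * sin(cant).
    Simplified geometry, for illustration only.
    """
    offset = eye_relief_mm * math.sin(math.radians(cant_deg))
    return ipd_mm + 2.0 * offset

# Assumed values: 64 mm IPD, 10 deg cant, 15 mm eye relief.
print(round(lens_center_distance(64, 10, 15), 1))  # noticeably wider than 64
```

With these assumed numbers the lens centers end up around 5 mm wider apart than the eyes themselves, which matches the kind of mismatch people measured.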

Another confusion arose from the fact that, because of this design, when looking ahead one does not look through the lens (optical) center but slightly (or more than slightly) off center, which, with the given Fresnel technology, gives a slightly (or more than slightly) blurred image.

Plus, what is worse, it gives inconsistent blurriness in each eye when looking around (by moving your eyes, not your head): when the eyes turn to one side, the perceived blurriness improves in the eye closer to the side one is looking at, while it gets even worse in the “farther” eye.

The other consequence of the design is that the correct IPD setting depends on the depth of the eyes. This is because of the canted topology and the fact that moving the eyes farther from or closer to the lenses changes their relative position to the lens/panel axis.
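The eye-depth dependence can be sketched with the same simplified canted-axis geometry as before (cant angle assumed, not an official spec): moving the eye straight back by some amount leaves it drifting off the canted optical axis by roughly that amount times sin(cant), with the same optical symptoms as a wrong IPD setting.

```python
import math

def off_axis_shift(delta_depth_mm, cant_deg):
    """Perpendicular distance the eye drifts off a canted optical axis
    when it sits delta_depth_mm farther back along the straight-ahead
    (Z) axis. For non-canted optics (cant_deg = 0) the shift is zero.
    Simplified geometry, for illustration only."""
    return delta_depth_mm * math.sin(math.radians(cant_deg))

# Assumed 10 deg cant: eyes set 5 mm deeper end up almost a millimeter
# off each lens axis, even though the IPD setting itself is "correct".
print(round(off_axis_shift(5.0, 10.0), 2))
```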

People were trying to compensate for the blurriness incurred by the lens offset by lowering the IPD setting, to the point where they were looking through the lens center when looking ahead. This improved clarity but totally screwed up the geometric verity, because everything was rendered for a much smaller IPD.

Software IPD

Now, with software IPD, one can change the IPD at which the scene is rendered without moving the lenses. This does not really make sense, except in one specific situation: when one has already set the hardware IPD to a wrong value (e.g. in an attempt to remove the blurriness). In this case the software IPD is actually an option to remedy the situation by compensating for the hardware IPD with the inverse software IPD offset, thus restoring the proper IPD geometry while still having the lenses at the (possibly) more comfortable spot.

For example: my IPD is 68, but I set the hardware IPD to 60 because I cannot stand the blurriness of the correct configuration. In that case I can compensate by setting the software IPD to +8, which puts the rendered IPD back at 68 but allows me to keep the lenses at the offset spot.

This could technically disturb the distortion profile, because the lenses are no longer at the same relative position to the panel for which the distortion was calculated (unless Pimax calculates the distortion on the fly, in which case there is no problem), but this is the price you possibly pay, considering the other options are not acceptable.
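The compensation scheme described above is purely additive. A minimal sketch (the function names are mine, not PiTool’s actual settings API), including the world-scale intuition from earlier in the thread:

```python
def rendered_ipd(hardware_ipd_mm, software_offset_mm):
    """The IPD the scene is rendered at: the hardware setting plus the
    signed software offset (illustrative model of the scheme above)."""
    return hardware_ipd_mm + software_offset_mm

def apparent_scale(true_ipd_mm, rendered_ipd_mm):
    """World-scale intuition: rendering at a larger-than-true IPD
    ("eyes of a giant") makes the world look smaller (ratio < 1);
    a smaller-than-true rendered IPD makes it look enlarged (> 1)."""
    return true_ipd_mm / rendered_ipd_mm

# Hardware dialed down to 60 mm for clarity; a +8 mm software offset
# restores rendering at the true 68 mm IPD.
assert rendered_ipd(60, +8) == 68
# Without the compensation, rendering at 60 mm for 68 mm eyes makes
# the world look oversized.
print(apparent_scale(68, 60))  # > 1
```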

Considering that the correct IPD setting depends on the eye depth, and that Pimax could only ship one distortion profile for the product, there is already potential for misfits. Add the fact that the distortion profile is itself distorted (as mentioned by @RobCram), and that people are typically confused by the blurriness and the lens-distance mismatch, and there are enough factors to basically guarantee that a regular user must be lucky to get it right.

Those who were less lucky end up here.

17 Likes

I have exactly the same feeling when I fly in Elite, which technically should be showing kilometer-long distances, but it looks more like taking a very close look at a miniature.

I can see the fishbowl effect in the center when I start changing the IPD to values smaller than mine (72 mm), which is also the max value I can set on my P5K+. The smaller I make the IPD, the more pronounced it gets. The easiest test is with the static image Pimax shows before starting SteamVR.

2 Likes

Yep, I used the V2 at Immersed 2017 (not the M2; that was at the Berlin meet in 2018).

In any pics of the Pimax & StarVR headsets you can see evidence of the Fresnel fried-egg topography. With the XTAL it looks clearer, as aspherical lenses do.

Now, that being said, can I be 100% sure? No, as I wasn't at the event Pimel was, and I would need to have personally seen said lens samples.

2 Likes

I would love to see the 8KX screens with the XTAL lenses lol

2 Likes

( Risa has already covered and explained the important bits infinitely better than ever I could, but I’ll spew some babble anyway… :stuck_out_tongue: )

The 8K/5K actually fares relatively well on the binocular-overlap account, being very similar to the Index – the Vive would be the “winner” among common headsets on this aspect, beating the Index and Pimax by 6-ish degrees.

What limits stereo overlap, with the HMD construction scheme that has so far been most common, is the fact that the left- and right-hand screens bump up against one another and can’t go any farther, so that’s where the view ends in the anti-lateral direction (there is a multitude of potential solutions, but I digress). :7

No, that’s not really the problem as such, unless one views it in a very localised way, which will hopefully become clear in the next few paragraphs (…and which is quite likely what you actually mean by what you write).

The point about the Index glare is not so much that it is bad despite the two-element lenses, as, on the contrary, that the glare is not worse than it is, despite them. :7

You’ve got twice the surface area of “walls” between “collapsed” sections of lens, which can scatter, reflect, refract, diffract, and conduct light, so one might have expected it to be not just twice as bad as the Vive, but pretty much squared… :7

The “yellow glare” I mention is also a very specific thing, which I have not seen in any headset other than my 5K+. It looks kind of like one of those country flags that consist of a cross – a black cross on a yellow background, in this case. I get the impression it is either edge bleed from the backlight, or just the usual polarization-affected light bleed you get when you do not look at an LCD screen perpendicularly, simultaneously concentrated and scattered by the Fresnel lenses.

There will be no sentimentality from me either, the day everybody stops using Fresnels – leave them up in the lighthouses, where they belong. :7

…so what I was specifically aiming at in the post I made, and which you quoted, was not directly stereo overlap as a whole, nor directly screen canting, but the field curvature of the lenses in the HMD, which makes it so that only a tiny spot in the centre of each lens is in focus (often erroneously referred to as the “sweet spot”; please stop that, anyone who does it and reads this - the sweet spot is something else, and we do not need the ambiguity. :P).

The problem of only such a small portion of the view being in focus is exacerbated by the canted lenses: since they face out in different directions, they overlap less, which means even less of the stereo-vision part of the view is sharp (even while it can kind of give you the impression that the total width of the sharp area is both of them combined, making it “feel” larger than it really is – you had something similar with the Rift CV1). And as you turn your eyes, one eye comes better into focus at the same time as the other moves out of focus. But I consider these matters symptoms, rather than the root of the problem.

As Risa explained: the projection rendered by the game is set up to match the spatial relationship between the game camera and its viewplane to the optically optimal spatial relationship between the user’s eye and the portion of the screen the view will be drawn on, with some consideration given to the effects of the lens. This is also the axis around which lens distortion happens, and around which the compensating software distortion is applied.

(Rendering canted (EDIT2: …and displaying, too) is very efficient compared to extending the FOV with parallel viewplanes, for reasons illustrated by simple trigonometry.)
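That trigonometry can be sketched as follows (the FOV and cant figures are my own illustrative numbers): a single flat viewplane covering a total FOV needs a half-width of tan(FOV/2), which explodes as the FOV approaches 180°, while splitting the view into two canted frusta keeps each tangent, and hence each rendered plane, small.

```python
import math

def viewplane_halfwidth(total_fov_deg):
    """Half-width (in units of the viewplane distance) of a single flat
    viewplane covering total_fov_deg horizontally."""
    return math.tan(math.radians(total_fov_deg / 2.0))

# One flat plane for a 170 deg FOV is enormous...
wide = viewplane_halfwidth(170.0)
# ...while two frusta canted outward so that each only covers ~85 deg
# stay modest (illustrative numbers, not Pimax's actual frustum setup).
canted = viewplane_halfwidth(85.0)
print(round(wide, 1), round(canted, 2))  # tan(85 deg) ~ 11.4 vs tan(42.5 deg) ~ 0.92
```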

This, as he said, means there is an optimal positioning of one’s eye in relation to the lens, in order for the virtual- and real-world projections to line up and everything to have geometric verity. It also means that when you cant the two optical trains, changing the distance between your face and the HMD moves you not only on the Z axis for lining up the two projections, but on the X axis as well. So if for any reason you just. can. not. achieve this optimal position, then to at least match on the X axis (where being off would skew the projection), and sacrifice only the Z (which you may already be doing with other, non-canted headsets, by having too much or too little eye relief), you need to increase the lens spacing if there is too much room between your face and the HMD, and decrease it if you are too close.

…so as he also suggested: what many have been doing to “fix” the problem is to reduce the lens spacing until the two little spots in focus line up for them when converging on something in-game that feels a “typical” distance away (hence all the nonsense about “near IPD”). This does maximise the tiny area of simultaneous focus for both eyes, but it inherently both totally skews the alignment between the projections and means there is NO direction they can look in which they are looking perfectly along the axis of either lens, to receive optimal optical conditions (EDIT3: …including lining up with the… umm… “fault lines” between Fresnel lens segments, for least visibility).

Now, I could be wrong, but as far as I can reason, the primary reason for the two-element lens in the Index is to “flatten the field”, so that the focal distance of the lens increases away from its central axis (as seen from the position of the eye), following the distance to the screen as closely as possible across its entire area, and not just at that tiny spot in the centre.

THIS is what we need: the entire view in focus, so that instead of a blurry mishmash just a degree away from the centre, there is contrast that helps you make out stereo parallax across the entire view - not just inside that tiny radius in the centre (unless all you ever play is racing games, where your eyes are glued to the road far up ahead. :7)
(EDIT: If we have that, anything (such as lens canting) that moves the centres of the two views apart will no longer be a significant problem.)

8 Likes

The missing depth impression can also come from the border of the field of view being too far from the stereo overlap. The border works as a two-dimensional reference plane for the depth impression (like the frame of a 3D TV (= pop-in/pop-out)). Therefore: the smaller the FOV, the clearer the stereoscopic depth impression. My thesis could be tested with an XTAL or HMDs with a similar FOV.

Sorry for my English :smiley:

2 Likes

It’s possible that there is an issue with ED itself.

A few years ago, I was playing ED on an Nvidia “3D Vision” monitor. Initially, it looked like the stars were about 20 feet (6 m) away. After a bunch of tweaking, I was able to move the stars farther back, so that optically the stars were 40-50 feet (12-15 m) away and the planets finally looked closer than the stars.

I see pretty decent depth in ED using my 8K. Next time I play, I’ll try to remember to look at planet vs star depth on my 8K.

4 Likes

What thing did you do that gave you better 3d?

1 Like

In short:

Set software IPD to -2
Dial hardware IPD back to your measured IPD.

3 Likes

I tried -1 earlier. It did improve the 3D effect, but it also caused eye strain. I think it also slightly reduced the SDE. In Elite D, the ship surroundings felt closer and the stars seemed further away. Previously, I always felt like my arms were too skinny and my hands and the console were too far away. Reducing the software IPD fixed that scaling issue.

I played for a little over an hour and I still feel the eyestrain, but it was a definite improvement in-game. I think I’ll try reducing the hardware IPD tomorrow and see if I can get the scale improvement without any eyestrain.

4 Likes

Weird, the 3D all looks the same to me, Pimax or Vive.

1 Like

Well, the Vive is not exactly known for edge-to-edge sharpness, either. :7

1 Like

Seems to differ person to person

1 Like

Neal, do you have another VR headset to do a direct comparison? It’s a pain to set up, but worth it.

2 Likes

No, I don’t. The 8K was the first headset that I felt met my minimum requirements. I don’t plan on buying another headset for a while, and when I do, it will probably be the 8KX. None of my friends have a VR setup either (and they aren’t particularly interested). My best friend wasn’t impressed by the 8K, mostly because he’s far-sighted yet doesn’t wear reading glasses (so everything was a blur).

2 Likes