All the diagonal FOVs of Pimax 5k+

I would like to finish the series on all the different FOVs of the Pimax headset (https://community.openmr.ai/t/all-the-different-fovs-of-pimax-5k/16053) (https://community.openmr.ai/t/all-the-parallel-fovs-of-pimax-5k/18700) by taking a closer look at the diagonal FOV.

First, I need to say that specifying the diagonal FOV of a headset (without knowing the aspect ratio) is useless in exactly the same way as specifying a pixel count instead of the monitor resolution. If one knows the correct (display) aspect ratio, one can calculate the (display) resolution from the pixel count, but in general it does not convey much information.
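To illustrate the monitor analogy with a minimal sketch (the pixel count and the aspect ratio below are just example figures), knowing the aspect ratio is enough to recover the resolution from the pixel count:

```cpp
#include <cmath>
#include <cstdio>

int main() {
    // Example figures only: a 16:9 display with ~2.07 Mpix.
    const double pixel_count = 2073600.0;
    const double aspect = 16.0 / 9.0;              // width / height

    // width * height = pixel_count, width / height = aspect
    const double height = std::sqrt(pixel_count / aspect);
    const double width  = aspect * height;

    std::printf("%.0f x %.0f\n", width, height);   // -> 1920 x 1080
    return 0;
}
```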

For some reason, however, marketing departments seem to love it, especially when they fail at the math and can claim arbitrary numbers. Luckily, it is easy to verify the marketing claims once one gets the headset and does some calculations.

The FOV calculation model is defined by the following picture:

The view is drawn to scale to correspond to the total stereo Normal FOV of the Pimax 5k+ (it should not differ much for the 8K). Total stereo means both eyes are assumed, and Normal FOV is the PiTool config option. I chose Normal FOV because the Large FOV would get cluttered (it is too wide) and the Small FOV does not have a very big difference between the horizontal and the vertical FOV. The picture, however, serves just as an illustration of the model. The calculation presented below is equally valid for any type of FOV.

The total stereo FOVs corresponding to the different configurations for Pimax 5k+ are (taken from https://community.openmr.ai/t/all-the-different-fovs-of-pimax-5k/16053):

Small FOV   horizontal = 120,28°  vertical = 103,56°
Normal FOV  horizontal = 140,28°  vertical = 103,56°
Large FOV   horizontal = 160,28°  vertical = 103,56°

For those interested, I include the calculation here; otherwise just skip to the results.

Calculation

Let’s define the points in the model with the following coordinates

bottom, left  -> BL = [tan_left, tan_bottom, 1]
bottom, right -> BR = [tan_right, tan_bottom, 1]
top, left     -> TL = [tan_left, tan_top, 1]
top, right    -> TR = [tan_right, tan_top, 1]

where:

tan_left = -tan(FOV_horiz/2), tan_right = tan(FOV_horiz/2)
tan_bottom = -tan(FOV_vert/2),  tan_top = tan(FOV_vert/2)

where:

FOV_horiz is the corresponding (total stereo) horizontal FOV
FOV_vert  is the corresponding (total stereo) vertical FOV

further:

O = [0, 0, 0] (system and view origin)

Using Euclidean geometry, we can calculate the diagonal FOV as the angle BL-O-TR:

FOV_diag = acos( dot(BL,TR) / (|BL|*|TR|) )

where:

dot(BL,TR) is the dot product of the vectors BL and TR
|X| is the length of the vector X
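A minimal sketch of this calculation in C++, using the total stereo FOV figures from the table above as input (angles in degrees):

```cpp
#include <cmath>
#include <cstdio>

// Diagonal FOV from the total stereo horizontal and vertical FOV (in degrees),
// following the model above: the corner points sit in the plane z = 1 at the
// tangents of the half angles, and the diagonal is the angle BL-O-TR.
static double diagonal_fov(double fov_horiz, double fov_vert) {
    const double deg = std::acos(-1.0) / 180.0;                 // degrees -> radians
    const double tan_right = std::tan(fov_horiz / 2.0 * deg);   // tan_left = -tan_right
    const double tan_top   = std::tan(fov_vert  / 2.0 * deg);   // tan_bottom = -tan_top

    const double bl[3] = { -tan_right, -tan_top, 1.0 };         // bottom-left corner
    const double tr[3] = {  tan_right,  tan_top, 1.0 };         // top-right corner

    const double dot = bl[0] * tr[0] + bl[1] * tr[1] + bl[2] * tr[2];
    const double len = std::sqrt(tr[0] * tr[0] + tr[1] * tr[1] + tr[2] * tr[2]); // |BL| == |TR|

    return std::acos(dot / (len * len)) / deg;
}

int main() {
    std::printf("Small  FOV diagonal = %.2f\n", diagonal_fov(120.28, 103.56)); // ~130.2
    std::printf("Normal FOV diagonal = %.2f\n", diagonal_fov(140.28, 103.56)); // ~143.7
    std::printf("Large  FOV diagonal = %.2f\n", diagonal_fov(160.28, 103.56)); // ~160.7
    return 0;
}
```

The outputs match the Results section below up to rounding of the input angles.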

Results

Small FOV   horizontal = 120,28°  vertical = 103,56°  diagonal = 130,23°
Normal FOV  horizontal = 140,28°  vertical = 103,56°  diagonal = 143,66°
Large FOV   horizontal = 160,28°  vertical = 103,56°  diagonal = 160,74°

If your screen is flat, and if my Pythagorean knowledge serves me well, you’re a little short on the diagonal FOV on the last one; 160,74° should be nearer to 190,82°, no?


I am not aware of any 3D model for FOV calculation which can use the Pythagorean theorem. The model I used is described above, and the results are calculated accordingly.

If you could describe yours, I would comment on it then.

It’s not mathematically correct to do it like I did, but 190 degrees is closer to the value advertised by Pimax and makes sense vs the 180 degrees advertised by XTAL.
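(For reference, the ~190,8° figure presumably comes from applying the Pythagorean theorem directly to the two angles, as if they were side lengths on a flat screen; a quick sketch of that calculation follows.)

```cpp
#include <cmath>
#include <cstdio>

int main() {
    // Naive "flat diagonal": treat the two angles themselves as the legs of
    // a right triangle. This appears to be where the ~190.8 deg figure comes
    // from, but it is not a valid operation on the angles of a 3D frustum.
    const double fov_horiz = 160.28;
    const double fov_vert  = 103.56;
    const double naive_diag = std::sqrt(fov_horiz * fov_horiz + fov_vert * fov_vert);
    std::printf("%.2f\n", naive_diag);  // -> 190.83
    return 0;
}
```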

The field of view is defined by a solid angle inscribed under the spherical virtual screen. I will think about this…

But don’t you feel something is wrong if your H_FOV=D_FOV?

Large FOV   horizontal = 160,28°  vertical = 103,56°  diagonal = 160,74°

The values advertised by Pimax for the diagonal FOV were never explained, even when I asked them to explain, and were just repeated over and over. I was not able to come up with any mathematical model (planar or spherical) which could produce those values.


They are not the same; the diagonal is bigger. The theory behind the calculation is correct, so unless I made some other mistake, the figures are correct too.

FWIW, when I drew the illustration (included in the OP), I also “measured” the diagonal FOV in the drawing, and it corresponded to the calculated value (~143° for Normal FOV).


Including how they figured them out. I’m sure it’s some kind of witchcraft math.

For example, if you take the screens forming a triangle & place 2 parallel lines, one at the apex & the other above the base of the triangle, the lines might hit 100° on the left & right sides.

Now of course this might be ridiculous, much like Dynamic contrast, but it’s par for the course with marketing bigger numbers.

What we need is a clear standard for VR HMD makers to adhere to when reporting specs. It would save a lot of having to examine each one directly to find the real truth.

But let’s put an engineer on the spot. @pimaxusa can you explain how Pimax calculates a 200° diagonal FOV when the actual perceived usable (rendered) horizontal & vertical FOV doesn’t seem to support the advertised spec?

When I tested mine, I got around 120-125° horizontal on Small and 150° horizontal on Normal FOV.


When the application is instructed (via OpenVR, which in turn is instructed by the headset) how to render the scene projection with a specific FOV, it cannot render it differently (with a different FOV), and you cannot observe a different output than what was originally passed onto the app by OpenVR. It would be like giving a 1 m x 1 m canvas to a painter and then, after he is done, observing a picture which is around 1-1.5 m wide and 1-1.5 m high. It simply cannot happen.

I know it may sound confusing, but imagine that in order to render the scene, the application must know precisely what the projection parameters are (one of which is the FOV). It cannot guess, or render more or less for better luck, because if it did, it would break the angular correctness (the perspective projection correlation) of the scene.
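As an illustration, here is a minimal sketch against the OpenVR C++ API (assuming SteamVR and the headset are running; error handling mostly omitted) of how an application reads the raw projection values it is instructed to use, and what FOV follows from them. Note these are per-eye values; the total stereo figures quoted above combine both eyes.

```cpp
#include <openvr.h>
#include <cmath>
#include <cstdio>

int main() {
    // Connect to the running OpenVR system (needs SteamVR and the headset).
    vr::EVRInitError err = vr::VRInitError_None;
    vr::IVRSystem *sys = vr::VR_Init(&err, vr::VRApplication_Background);
    if (err != vr::VRInitError_None || sys == nullptr) return 1;

    // The raw projection values are tangents of the half angles of the
    // per-eye frustum; the application has to render with exactly this
    // projection, it cannot choose a different FOV.
    float left = 0, right = 0, top = 0, bottom = 0;
    sys->GetProjectionRaw(vr::Eye_Left, &left, &right, &top, &bottom);

    // fabs keeps the result independent of the sign convention of the values.
    const double rad2deg = 180.0 / std::acos(-1.0);
    const double fov_h = std::fabs(std::atan(right) - std::atan(left)) * rad2deg;
    const double fov_v = std::fabs(std::atan(bottom) - std::atan(top)) * rad2deg;

    std::printf("left eye FOV: %.2f x %.2f deg\n", fov_h, fov_v);

    vr::VR_Shutdown();
    return 0;
}
```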


Agreed. Maybe our former NASA employee can explain the voodoo math used. :smirk:


Hmm, interesting. But as far as I understand it, looking at your illustration, you are basically measuring the FOV from one single point of view? I thought measuring the total field of view with math could only be done by measuring one eye at a time, as your perspective is different for each eye due to the distance between your eyes, and then somehow calculating both FOV numbers together with some voodoo math magic :slight_smile:

To be honest I have no clue, but I know I measured somewhere around 165° in the RoV horizontal FOV test the last time I tried. But then I have a high IPD of 69.5, which could affect it negatively.


Reading the title, I thought:

Pimax had changed the FOV with every batch of the 5K+

:sweat::roll_eyes::no_mouth::grinning::blush:

There are a few things which might not be that intuitive about 3D angles (which the FOVs actually are). One, for example, is that it does not matter whether you consider two eyes or only one eye, as long as the FOV is the same. Now you may counter-argue, “but hey, my eyes are a few centimeters apart, so I should see more”, but in reality you do not; it is just that your eyes are not at the FOV center point but slightly in front of it (so they can both fit into the viewing frustum).

In other words, when calculating the FOV, it does not matter if you use one single point or you put two eyes inside the (same) viewing frustum, because the two eyes won’t change the angular properties of the frustum; all the angles (horizontal and vertical) will look the same. (I used a simple model here, to not confuse people even more and to make it clearer what the different FOVs are - horizontal, vertical and diagonal.)

If you “split” the view center (the single point) into two eyes and move them apart, it won’t change the angles at all, because the split just moves the sides of the view “further apart” but does not change the angles those sides keep to each other. (This assumes the eyes are looking in the same direction as the previous single eye.)

There is one thing about the RoV test which I believe might be wrong: it actually paints the angle marks from the central point (the head coordinate) and not the eye coordinate. Then you would actually be able to see a mark originally intended for 165° in a 160° FOV, simply because your eyes are a few centimeters apart, which adds some volume on each side of the FOV compared to the single-point “head” FOV. But again, while it “adds the volume” it does not change the angle, so if there is something really far away and 165° apart, at a certain distance it will eventually “fall off” of your FOV.

So, again, in this case it would not mean that you really observe a 165° FOV; it is just that the marks originally intended to mark 165° fall into your 160° FOV because of that. Or because your head/eyes are leaning back, therefore also increasing the area you can see at a close distance.
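A small numeric sketch of that “falls off at a certain distance” point, with made-up example numbers: a 160° total horizontal frustum (80° half angle per side), the right eye 32 mm to the right of the head centre, and a mark painted from the head centre at 82,5° (half of 165°):

```cpp
#include <cmath>
#include <cstdio>

int main() {
    // Made-up illustration numbers (not the actual RoV test geometry):
    // a 160 deg total horizontal frustum, the right eye 32 mm to the right
    // of the head centre, both looking straight ahead (+z).
    const double deg = std::acos(-1.0) / 180.0;
    const double half_fov   = 80.0;          // frustum half angle per side, degrees
    const double eye_offset = 0.032;         // metres, head centre -> right eye
    const double mark_dir   = 82.5 * deg;    // mark painted from the head centre

    for (double dist = 0.05; dist < 1.0; dist *= 2.0) {
        // Mark position in head-centre coordinates (x right, z forward).
        const double x = dist * std::sin(mark_dir);
        const double z = dist * std::cos(mark_dir);
        // Angle of the same mark as seen from the offset right eye.
        const double seen = std::atan2(x - eye_offset, z) / deg;
        std::printf("distance %.2f m: mark at %.1f deg from the eye -> %s\n",
                    dist, seen, seen <= half_fov ? "inside 80 deg" : "outside");
    }
    // The eye offset does not change the frustum angle; it only lets nearby
    // marks placed beyond 80 deg (from the head centre) slip inside. Far away,
    // the mark converges back to 82.5 deg and falls outside the 160 deg FOV.
    return 0;
}
```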

I guess this would be best addressed by the author of the RoV test @knob2001 (or @oscar_rov?).

Anyway, I can prove that the application cannot render more (or less) than what it is instructed to render by OpenVR (which is instructed by the headset itself), and I can also prove that you cannot see what the application has not rendered ;).
