Wishing for a set of 170 FOV lenses that could use the full pixels of the panel

When switching from 200 FOV to 170 FOV, is part of the panel simply blocked, so that the effective pixel count decreases? If so, 200 FOV and 170 FOV will have the same pixel density and the same perceived clarity, with the only difference being the FOV.

Ideally, what I most anticipate is 170 FOV done in hardware: redesigning the lenses to use all the pixels of the panel within 170 FOV. That would give us a significant boost in pixel density and perceived clarity, while avoiding the distortion at the edge. Of course that would involve a lot of work, but it could be a future option to consider. In fact, I have always thought that would have been the best option from the very beginning.
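
The difference between the two 170 FOV options can be sketched with back-of-the-envelope numbers. Everything here is an assumption for illustration (3840 horizontal pixels per eye, FOV treated as a linear horizontal angle, distortion ignored):

```python
# Rough pixels-per-degree comparison for one eye. All numbers are
# assumed round figures, not official Pimax specs.
def ppd(pixels, fov_deg):
    """Average pixels per degree across the horizontal FOV."""
    return pixels / fov_deg

panel_px = 3840  # assumed horizontal pixels per eye

stock_200    = ppd(panel_px, 200)                  # current lens, full panel
software_170 = ppd(panel_px * 170 / 200, 170)      # crop: fewer pixels, same density
hardware_170 = ppd(panel_px, 170)                  # redesigned lens: all pixels in 170 deg

print(f"200 FOV stock lens:    {stock_200:.1f} ppd")
print(f"170 FOV software crop: {software_170:.1f} ppd")
print(f"170 FOV new lens:      {hardware_170:.1f} ppd")
```

Under these assumed numbers, the software crop keeps exactly the stock density, while a lens that squeezes the full panel into 170 degrees raises it, which is the whole point of the proposal.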


That means Pimax would have 3 modes in their software:

200 FOV
170 FOV with the normal lenses
170 FOV with the new lenses

The normal lenses took a lot of money to tune; I'm not sure Pimax will do it again for new lenses when it's not clear that people will buy them.


Or swappable lenses, but that would mean a new headset design, so hell no.

No, it won’t. You seem to think that by reducing the field of view with different lenses, you will get a sharper image due to the screen being smaller. This is not true: the screens themselves aren’t changing physical size. While reducing the physical screen size (hardware) would in fact create a sharper image with a lower field of view, simply changing the lenses won’t work. For your solution to work, you would either need a way to push the screen further back from your eyes (which might mess up the focal distance) or create another version of the headset with smaller 4K screens.


No, channeling the light from the two panels into 170 degrees would require a different lens, because the current lens spreads it across 200 degrees. To increase PPI you need to bend the light closer together, and that takes a lens designed to do so.


Upon reflection, I’d like to have options for 170° with and without black bars.

Without the bars, there would be a magnification effect, not unlike my prescription glasses. I think it’s something you could get used to. It might be worth it for some games, like Elite Dangerous, which displays a lot of small text. I’m worried I won’t be able to read it (at least until I get my Pimax base stations and I can lean forward).

Regardless, I’m glad Pimax is offering this feature. I’m worried that my 980Ti will not be powerful enough for a reasonable experience. I’m waiting for NVidia to release the 1180Ti, before I upgrade.

I have the Pimax 4K and I can tell you text is perfectly easy to read in E:D. The 8K will be similar, I believe.


I certainly hope so. I’m worried that the wide FOV will contract the middle too much. (I’ve run FOV tests on my 4K monitor.) It won’t be an issue once I have base stations and can lean forward. Also, heavy supersampling helps a lot, but I’m worried that my 980Ti won’t be up to the task.

Well, if you buy dual 4K screens you might want to consider upgrading your GPU. But you can check what you need once you get the headset; things might actually be cheaper by then, and if not, the next version will be closer to launch anyway.

(Edit: I suppose if the lenses are magnifying, and the magnification were reduced with a different lens to use more of the screen, it would indeed look better. That would probably require more software work, though.)


Just for better comprehension: when you talk about 200 FOV and 170 FOV, you are referring to diagonal FOV. The same FOV measured horizontally would be approximately 170 FOV and 140 FOV.

That is to say:
200 FOV diagonal = 170 FOV horizontal
170 FOV diagonal = 140 FOV horizontal

In my posts I am always talking about horizontal FOV and the proposal of new 140 FOV horizontal lenses (equivalent to the 170 diagonal FOV that you mentioned).

I know that what I have just written is very redundant, but I prefer to write more and make it clear to everyone.

Now, to answer your questions:

Yes, effectively, by changing the FOV in software you are blocking/wasting pixels. Any software FOV change is basically an image crop and will have the same pixel density.

If you change the FOV in software, the pixel density remains the same. Pixel density can only change optically, i.e. by switching to a lens with lower magnification or to a screen with higher resolution of the same technology. (Currently the pixel density of LCD and OLED screens differs; it is much higher on LCD, but LCD also has disadvantages such as worse blacks and less intense colors.)

The advantage of changing the FOV in software is that the zone that is not drawn does not consume graphics resources. This holds if the headset's rendering software is well done, of course.

For example, and in theory, with a small software FOV you could use the GTX1070, with a medium-sized one the GTX1080, and with a large one the GTX1080Ti.
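
A tiny sketch of that render-cost argument, with an assumed per-eye render target and a linear crop (real headsets render off-screen at higher supersampled resolutions, so these numbers are illustrative only):

```python
# Why a software-cropped FOV saves GPU work: the undrawn band of the
# panel simply isn't rendered. Panel size and FOVs are assumptions.
panel_w, panel_h = 3840, 2160  # assumed per-eye render target

def rendered_pixels(fov_h_deg, full_fov_h_deg=200, w=panel_w, h=panel_h):
    """Pixels actually rendered when cropping horizontally (linear approx)."""
    return int(w * fov_h_deg / full_fov_h_deg) * h

full  = rendered_pixels(200)
small = rendered_pixels(140)
print(f"140/200 crop renders {small / full:.0%} of the pixels")  # prints 70%
```

Under this approximation, the GPU load scales directly with the chosen FOV, which is why a narrower software mode could make weaker cards viable.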

@yanfeng, you are a tester like us, and you have the Pimax 4K and have made comparisons with the M1, just like us. Your questions make a lot of sense because you're looking for more perceived clarity, just like me, since you don't find it in the M1.

Hopefully other testers and backers will be able to test and compare the Pimax 4K with the M1/M2 and realize that today's lenses could be much better, and that we are uselessly wasting an enormous number of pixels that could serve to gain much more perceived clarity.

The problem I see is that Pimax has wanted to keep its promise of 200 diagonal FOV (170 horizontal FOV) at all costs, without taking into account that this wastes a lot of screen pixels we could use for more perceived clarity, and that not everyone has a GTX1080Ti, or the money or desire to buy a GTX2080Ti; many of us work with a GTX1070 or 1080.


I am curious which mathematical model you used to derive these figures. I was trying to work it out myself not long ago, and was not able to come up with a planar or spherical representation that would give (even remotely) those results. In my model, 200° diagonal FOV must have more than 180° horizontal FOV and, vice-versa, 170° horizontal FOV must have less than 180° diagonal FOV.
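
For what it's worth, here is one possible planar (pinhole camera) model of the diagonal/horizontal relationship. The 90° vertical FOV is an assumption; the point is only that at wide angles the tangent blows up, so under this model the diagonal barely exceeds the horizontal:

```python
import math

# Planar model: a flat screen viewed through a pinhole. Only valid
# while each angle is below 180 degrees. Vertical FOV is assumed.
def diagonal_fov(h_fov_deg, v_fov_deg):
    """Diagonal FOV implied by horizontal and vertical FOV on a flat screen."""
    th = math.tan(math.radians(h_fov_deg) / 2)
    tv = math.tan(math.radians(v_fov_deg) / 2)
    return 2 * math.degrees(math.atan(math.hypot(th, tv)))

print(diagonal_fov(170, 90))  # just over 170 deg -- nowhere near 200
print(diagonal_fov(140, 90))  # a bit over 142 deg -- not 170
```

So the flat model cannot reproduce the 200-diagonal/170-horizontal figures, which supports the question above; a curved-screen or per-eye-canted model would behave differently.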


One of the recent Pimax updates said:

“Now we have the new equipment that supports 190 FOV test, it definitely will help the engineers to test and correct the distortions at the edge, the accuracy will be largely improved.”

Have you tested the M1 with these new corrections? Maybe the lens doesn't need to be 140 and could be something like 160?


Hey man, have you seen this? @SweViver @mixedrealityTV

This is how the lenses should look (but with proper distortion correction, and speaking just of the lenses). These look like simple Fresnel magnifier sheets cut to fit an HMD. See how clear it looks even from far away?


As this demo was made 5 years ago and these lenses still haven't been confirmed working, I'm afraid they realized there is much more to distortion correction than just making a big-ass lens covering the whole interior of the headset.
It looks really interesting indeed, but while the guy is moving his head around, it's clear the whole perspective and FOV look totally wrong through the lenses.

What I'm trying to say is: if this really worked, how come nobody has released that headset yet?


No news of the project in the last four years.


That’s why I believe they are using eye tracking to dynamically adjust for the distortion. Also, the video description / MTBS3D forum thread says the video was done without any correction in software.

These are the lenses used in the InfinitEye V1 (i.e. StarVR v1). I believe these are a variant of those, although he stacked lenses with different focal lengths.


I think it's a stack of two or three of these.

What is amazing is how free of distortion it looks when viewed straight on, from the POV of the camera.

I think Pimax's current distortion-measuring equipment and software could correct for these issues, and this is all open source.

Indeed. It looks almost too good to be true 🙂 To the point where I start to wonder if it really looks that good while wearing it 🙂 Let's hope they continue the development. Very promising!


This thread is years and years old. The point is, the guy produced those results in his spare time with Fresnel sheets, without software correction or a Kickstarter budget. He is also using 1280x800 LCD displays.

If you used these lenses in the 8K, I'm sure it would look brilliant.

What makes me doubt the clarity of these lenses is that VR lenses are supposed to give an optimal view from a certain point of view. Why? Not only because the lens magnifies the image, but also because it is constructed so that you focus at least 1 or 1.5 meters “into” the panel. You are not supposed to focus your eyes on the panel; the focal point is much further away. This is mandatory to make VR comfortable to use for hours without straining your eyes.

This kind of “flat” lens, without further thickness and magnification, definitely cannot place the focal point 1-2 meters away. It just goes against the rules of optics.
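
A quick thin-lens sanity check of that argument. The numbers here (5 cm eye-to-panel distance, virtual image at 1.5 m) are assumptions for illustration, not Pimax specs:

```python
# Thin-lens equation 1/f = 1/d_o + 1/d_i, with the virtual image
# on the same side as the object (d_i taken as negative).
def focal_length(d_object_m, d_virtual_image_m):
    """Focal length needed to place a virtual image at the given distance."""
    return 1 / (1 / d_object_m + 1 / (-d_virtual_image_m))

f = focal_length(0.05, 1.5)   # panel 5 cm away, image pushed out to 1.5 m
m = 1.5 / 0.05                # angular magnification of the virtual image, ~30x
print(f"required focal length: {f * 1000:.1f} mm, magnification ~{m:.0f}x")
```

The required focal length comes out just above the 5 cm panel distance, i.e. a strongly converging lens; a weakly powered flat magnifier sheet would leave the focal point far closer than 1-2 meters, which is the objection being made.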

Correct me if I'm wrong 🙂