Extreme Pincushion Lenses with LMS for fewer wasted pixels

I’ve overlaid the center 60 degrees of the Vive’s view onto the Pimax view, using the dumps provided by the Pimax team. You can see that the Vive actually has more horizontal detail in the center: the same wall is rendered wider. The vertical FOV was not to spec, so I had to stretch it approximately; it shows Pimax winning in vertical detail by a large margin, about 70%.

With a linear projection like Pimax’s, you end up with uneven pixels-per-degree (PPD) that wastes a huge amount of screen real estate on the periphery. Assuming a monocular 160° FOV, the average oversampling relative to a uniform 1 PPD baseline is tan(80°) / (80 × tan(1°)) ≈ 4×, with about 90% of that in the periphery beyond the center 60 degrees, so Pimax is spending roughly 3.5× more screen real estate than it should there. Alternatively, look at how expensive 1 degree of real estate is at the peripheral edge: (tan(80°) − tan(79°)) / tan(1°) ≈ 30× the center! By comparison, Vive’s peripheral edge is only (tan(55°) − tan(54°)) / tan(1°) ≈ 3× the center. The end result is that the center view, which is what you’re looking at most of the time, loses horizontal detail down to 1 / (3.5 / (100/60)) ≈ 48% of average, giving (4K / 160) × 48% ≈ 11 PPD, roughly Vive level!
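For anyone who wants to check these numbers, here’s a minimal Python sketch of the two ratios used above (the 1 PPD baseline is arbitrary; it just makes the factors easy to read):

```python
import math

def avg_oversampling(half_fov_deg):
    """Average horizontal oversampling of a linear (rectilinear)
    projection, relative to a uniform 1 pixel-per-degree baseline."""
    return math.tan(math.radians(half_fov_deg)) / (
        half_fov_deg * math.tan(math.radians(1)))

def edge_degree_cost(half_fov_deg):
    """Screen width consumed by the outermost degree, relative to
    the width of one degree at the center."""
    tan = lambda d: math.tan(math.radians(d))
    return (tan(half_fov_deg) - tan(half_fov_deg - 1)) / tan(1)

print(avg_oversampling(80))  # ~4.1x  (monocular 160 FOV)
print(edge_degree_cost(80))  # ~30x center (Pimax edge)
print(edge_degree_cost(55))  # ~3x center  (Vive edge)
```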

EDIT: The Pimax team says the monocular FOV is 150° (they are sacrificing some binocular overlap), which by the same math puts horizontal center detail at about 60% of average, or 15 PPD at native resolution. That would be noticeably better, but the upscaled version that most people are buying will still be at Vive-level 10 PPD.
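Running the same sketch with the revised half-FOV shows the improvement at the edge as well:

```python
print(avg_oversampling(75))  # ~2.9x average oversampling at monocular 150 FOV
print(edge_degree_cost(75))  # ~14x center at the edge, down from ~30x
```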

One solution I’ve thought of is a lens that approximates an inverse fisheye via an extreme pincushion distortion, countering the PPD stretching of the linear projection. Interestingly, the Vive does this “accidentally” due to the pincushion caused by its magnification! If the pincushion is extreme enough, the center could be made to have even more detail than average, maybe up to 30 PPD, or 3× Vive (not too much, or else the periphery would have visible SDE).
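To make “inverse fisheye” concrete: the ideal target is an f-theta mapping (r = f·θ), which spends the same number of pixels on every degree, whereas a linear projection (r = f·tan θ) spends wildly different amounts. A small sketch comparing the two profiles, assuming a 1920-pixel half-width as a stand-in for half of a 4K panel:

```python
import math

def ppd_profile(mapping, half_fov_deg, half_px=1920):
    """Local pixels-per-degree at a few field angles, for a screen
    mapping r(theta) normalized so that r(half_fov) = half_px."""
    full = mapping(math.radians(half_fov_deg))
    for deg in (0, 30, 60, half_fov_deg - 1):
        th = math.radians(deg)
        # pixels spanned by the 1 degree starting at this field angle
        px = (mapping(th + math.radians(1)) - mapping(th)) / full * half_px
        print(f"  {deg:2d} deg: {px:5.1f} px/deg")

print("linear projection (r = f*tan theta):")
ppd_profile(math.tan, 80)        # ~5.9 at the center, ~178 at the edge
print("f-theta target (r = f*theta):")
ppd_profile(lambda th: th, 80)   # flat 24 px/deg everywhere
```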

Once you have such a lens, you can render for it efficiently via Lens Matched Shading (LMS). LMS works by approximating the counteracting barrel distortion during rasterization, which is much more efficient than the brute-force supersampling used for the Vive. Pimax would need LMS because the extreme pincushion would otherwise require around 2.5× horizontal supersampling (and 1.4× vertical). The LMS that Nvidia implemented isn’t actually pushing the hardware to its limits; it should be able to go up to 8 projections per eye for a more accurate barrel, which would amount to squarish viewports tiled across Pimax’s wide 4K display. LMS requires support in game engines and hardware, but I believe it’s already in Unity and Unreal now, and the Pimax 8K X is only really intended for Volta / 1080 Ti class GPUs, so the hardware is there (AMD’s next chip will probably support LMS too).
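As a rough numeric illustration of why LMS fits here (this is the concept only, not Nvidia’s actual API): LMS inflates clip-space w before the perspective divide, roughly w′ = w + a·|x| + b·|y|, implemented as 4 sign-flipped viewports per eye, so the rasterizer naturally spends fewer pixels toward the edges. Comparing the per-degree raster cost with and without such a warp (the coefficient a = 1 is an arbitrary choice for the demo):

```python
import math

def warped_x(theta_deg, a):
    """Post-divide x for a point at field angle theta under an LMS-style
    clip-space warp w' = w + a*|x| (one quadrant shown, with w = 1;
    real LMS uses 4 viewports per eye with sign-flipped coefficients)."""
    x = math.tan(math.radians(theta_deg))
    return x / (1.0 + a * x)

def raster_cost(theta_deg, a, half_px=1920):
    """Rasterized pixels spanned by the 1 degree starting at theta_deg,
    for an 80-degree half-FOV mapped onto half_px pixels."""
    full = warped_x(80, a)
    return (warped_x(theta_deg + 1, a) - warped_x(theta_deg, a)) / full * half_px

for a in (0.0, 1.0):  # a = 0 is plain rectilinear rendering
    print(a, [round(raster_cost(d, a), 1) for d in (0, 30, 60, 79)])
# a = 0: [5.9, 8.0, 24.4, 178.3]  -> shading piles up at the edges
# a = 1: [38.7, 21.0, 21.2, 29.0] -> close to the flat f-theta ideal
```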

Is this feasible optically? Maybe the Pimax team can look into this for the 8K X and work with Valve to add the necessary support to OpenVR?


I think these guys can help you out. Or at least get you where you need to be.

@PimaxVR @deletedpimaxrep1 @Matthew.Xu


I’m wondering if Valve’s new lenses do this, or at least help put more pixels where we look most of the time. They haven’t detailed anything about them.

Maybe it’s what LG was waiting for…