Poll: Plastic aspheric lenses for the 8KX (or 8KX 2020 Special Edition); and 5K/8K RE with 8KX lenses and 1080p input for better upscaling

it's not for someone from the internet or a few backers to make Pimax's business decisions, so imho your vote is pointless
also, the RE editions are targeted more at VR arcades, so if Pimax has an ear for something like this they will ask potential customers from VR businesses
it's not simply about what's possible in theory, you need a plan and a budget to reach the production phase, or to hit a target price for the product - see the XTAL? nice headset but way too expensive to be relevant for Steam VR gamers, or StarVR …?
it's not that easy and it's the "package" that counts. Valve also chose not to go for a wider field of view, to lower the cost and get an easier/better user experience. if Pimax is already testing eye tracking to improve the lens problems, then why shell out money for new lenses? we don't know about their internal processes, so you might be fussing about things that were already decided months ago
there are a lot of things for Pimax to juggle (like the expensive eye tracking modules they promised to backers, with no real use in SteamVR). they will not invest huge amounts of money in lenses if it's not absolutely needed, and as long as there is no StarVR One or a cheaper XTAL gamer's edition, they are the only ones with wide FOV - and if they think they sell well enough with the products they have …

4 Likes

@IG88 In case Pimax doesn't manage to get a good enough aspheric design in time, I'm sure almost everyone would support seeing the improved fresnel design referenced in the 8KX specification video implemented in future headsets, such as the RE or Pro.

The only change I could see making sense in the near term would be a 1080p mode. I have advocated for aspherical lenses for a very long time, but they are much more expensive as optical elements, and heavier.

Pimax would need a long time to prototype lenses, run them through a test phase, do quality assurance, retool, and ramp up production on them, if they even found any experiments with such optics successful.

Changing a core component in a mass-produced product is like trying to stop a freight train with a sheet of toilet paper. Only if you had about five years' worth of toilet paper could you actually stop the train; then you have to restart the train, build up new steam, and chug on.

What I do think would be possible in software is a 1080p mode where you slice the screen up into maybe 3 or 4 bands, and you do sliced time warps at 1080p.

For example, you render the far left portion of the screen at 1080p, hold in time warp, render the center at 1080p, hold in time warp, then render the far right.

Picture a band moving across your FOV horizontally at such a fast rate that your eyes don't notice. Because you could send 1080p, you get rid of the scaling issues, keep it one-to-one, and get more detail overall.
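To make the banding schedule concrete, here's a rough sketch (all names, numbers and callbacks are hypothetical, just to illustrate the idea, not how Pimax or anyone actually implements it):

```python
# Hypothetical sketch of the banded 1080p idea above: every refresh, one band
# is re-rendered at native 1080p and the remaining bands are held with a
# timewarp reprojection of their previous image. Illustrative only.

NUM_BANDS = 4  # slice the horizontal FOV into 3-4 vertical bands

def present_refresh(refresh_index, render_band, timewarp_band, scan_out):
    """One display refresh: the 'fresh' band is re-rendered, the rest warped."""
    fresh = refresh_index % NUM_BANDS
    for band in range(NUM_BANDS):
        if band == fresh:
            image = render_band(band)    # full 1080p render, 1:1 pixels
        else:
            image = timewarp_band(band)  # reproject the stale image to the latest pose
        scan_out(band, image)

# Dummy usage: the fresh band sweeps left-to-right across successive refreshes.
for i in range(8):
    present_refresh(i,
                    render_band=lambda b: f"fresh band {b}",
                    timewarp_band=lambda b: f"warped band {b}",
                    scan_out=lambda b, img: print(f"refresh {i}: {img}"))
```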


Maybe @SweViver can pass this along? @kons, what do you think?

2 Likes

Aspherical lenses can be manufactured more cheaply from optical plastic, but I see what you're saying. Maybe an 8KX SE or RE with those features in 2020; I'd pay $2000 for that.

It's not so much the material you use (yes, optical-grade acrylic is cheaper); it's all the reflective coatings and tooling costs to get an optic manufactured at scale. The current lenses as they are were very expensive to produce.

The rendering concept could be done in software on all devices. I wonder what @neal_white_iii @LoneTech and @Sjef think of the practicality of this rendering idea?

I'm pretty sure they do sliced time warp on Gear VR; I just don't think ATW has been applied the way I am suggesting.

2 Likes

@kons I didn't know StarVR had fused optics and used 4 displays. That sounds like a good idea, but is it possible to fuse optics without creating a noticeable joining line?

That sounds like a good idea for their next headset, which should be the 5KX or something with entirely different nomenclature.

Trying to fuse optics without a joining line is very difficult as I understand it, just from the point of view of yield.

2 Likes

Also, it looks like the StarVR rendering approach doesn't give you true peripheral images; they don't appear to be stacked over each other correctly, as if the peripheral screens needed to be rendered from a point of view slightly behind the centre render boxes.

Wouldn't the tearing look bad in that 1080p mode where you have 4 slices rendered to the screen at once? @kons, is this what StarVR does, or the blended views you were talking about? I suggested 1080p input because it upscales to 4K by an exact factor, whereas 2.5K is not an even ratio relative to the 4K panel resolution.
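For reference, the raw scaling arithmetic behind that suggestion (just the resolution ratios, nothing headset-specific):

```python
# Upscale factors from the input resolution to a 3840x2160 panel.
panel_w, panel_h = 3840, 2160
for name, (w, h) in {"1080p": (1920, 1080), "2.5K (1440p)": (2560, 1440)}.items():
    print(f"{name}: {panel_w / w} x {panel_h / h}")
# 1080p -> 2.0 x 2.0: each input pixel maps exactly onto a 2x2 block of panel pixels
# 2.5K  -> 1.5 x 1.5: a non-integer factor, so the scaler has to blend neighbouring pixels
```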

You would assume it would refract light at this kind of angle, using 3 or 4 layered lenses in triangle shapes



. > <



Maybe they're using canted lenses as well?

You wouldn't be displaying all of the views at once, but in succession. Think of it kind of like interlacing, except that instead of fields of individual lines, you have whole bands rendered one after another across the field of view. Your brain will blend them together.

Pimax was suggesting something like this for Brainwarp 2.0; I'm just expanding the idea a little bit.

2 Likes

Are StarVR's lenses canted? I know their screens are.

@Mantidtings you did not understand me.

The Pimax 8K X has a lot of potential.
They are not changing the fresnel lenses in 2019, OK.
But if they explore the possibility of a mod, making a Pimax 8K X Special Edition in 2020 with better lenses (for example aspheric lenses), that is their "know-how" (building lenses).
I TELL YOU, THIS WOULD BE A GREAT MARKETING OPERATION,
mainly if you could send your Pimax 8K X back to the factory for the upgrade.

1 Like

The rendering idea sounds like reinventing progressive scan as opposed to global shutter, using composited layers of independently motion smoothed slices. It will cause tearing and won’t improve performance demands as much as you might expect, as you need an overlap to handle any observer motion. And on top of that you now need to run motion compensation searches for many more streams.
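To put a rough number on that overlap (purely illustrative assumptions: a 90 Hz panel, four bands, only one band freshly rendered per refresh, and a brisk 300°/s head turn; none of these figures come from an actual headset):

```python
# Rough, illustrative numbers for the overlap mentioned above: if each band
# only gets a fresh render once per full cycle, the stale bands must carry
# enough extra margin to cover however far the head turned in the meantime.
refresh_hz = 90
num_bands = 4
head_rate_deg_s = 300  # assumed fast head rotation

seconds_between_fresh_renders = num_bands / refresh_hz            # ~0.044 s
overlap_deg = head_rate_deg_s * seconds_between_fresh_renders     # ~13.3 degrees
print(f"each band needs roughly {overlap_deg:.1f} degrees of extra overlap")
```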

The slicing does have the advantage that you can turn each slice to reduce the difference in pixels per degree, which is why it's used in e.g. panoramic rendering. It would be a pure win on a GPU capable of arbitrary multiview; unfortunately the GeForce 10 family only does a restricted 8-view mode (two main views separated by one variable, e.g. parallel projections, and a split in four to roughly approximate a centered lens - both designed for small field of view headsets).
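A quick sketch of why turning the slices helps: a planar projection packs pixels per degree proportionally to sec²(φ) of the off-axis angle, so narrower, re-aimed slices stay much closer to uniform. (The 60°/30° figures below are just an example, not any headset's actual FOV.)

```python
import math

def edge_to_center_density_ratio(half_fov_deg):
    """Pixels-per-degree at the edge of a planar (rectilinear) projection,
    relative to its center: d(tan phi)/d(phi) = sec^2(phi)."""
    phi = math.radians(half_fov_deg)
    return 1.0 / math.cos(phi) ** 2

# One projection spanning +/-60 degrees: the edge packs 4x the pixels per degree of the center.
print(edge_to_center_density_ratio(60))  # 4.0
# Two re-aimed slices, each spanning +/-30 degrees about its own axis: only ~1.33x.
print(edge_to_center_density_ratio(30))  # ~1.33
```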

On top of that, Oculus and Valve both decided to make the API for single view per eye, unlike OSVR. In OSVR you could pass all the views to the application in one pass, so it would know to render them for the same time.

I haven't spent the time to dig deeply into how OpenXR allows it, but at first glance it seems to have taken the clumsiest option of specifying fields of view for two eyes specifically and reprojecting later (possible output reprojections include a cylindrical extension).

2 Likes

@Mantidtings First, what you refer to as an "aspheric" lens is presumably meant as a "solid" (or "full" or "clear") lens in the sense of "non-fresnel", but using "aspheric" for that is misleading (as already pointed out, for example by @LoneTech), because it refers to the particular shape of the lens surface (which also defines its optical properties), not to the way this surface is implemented. A fresnel lens can implement either spheric or aspheric surfaces equally well.
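For anyone wondering what the "particular shape" part means in practice, here is the textbook even-asphere sag formula as a small function. The conic constant k and the higher-order coefficients are what make a surface aspheric rather than spherical; this is just the generic definition, not any specific Pimax lens:

```python
import math

def asphere_sag(r, radius, k=0.0, coeffs=()):
    """Sag (surface depth) of a rotationally symmetric asphere at radial
    height r, using the standard even-asphere formula. radius = base radius
    of curvature, k = conic constant, coeffs = higher-order terms (A4, A6, ...).
    With k = 0 and no extra coefficients this reduces to a plain sphere."""
    c = 1.0 / radius
    z = c * r**2 / (1.0 + math.sqrt(1.0 - (1.0 + k) * c**2 * r**2))
    for i, a in enumerate(coeffs, start=2):  # adds A4*r^4, A6*r^6, ...
        z += a * r ** (2 * i)
    return z

# Example: same base curvature, spherical vs. mildly aspheric profile at r = 20 mm.
print(asphere_sag(20.0, radius=60.0))                          # spherical surface
print(asphere_sag(20.0, radius=60.0, k=-1.0, coeffs=(1e-6,)))  # aspheric variant
```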

Second, the fresnel lens was invented precisely to help with the problems a "full" lens runs into when it becomes too thick: when covering a large FOV, or a high magnification, it becomes heavy (less so for plastic), bulky, and also more prone to optical imperfections caused by the increasing mass of not-totally-homogeneous material.

Incidentally, VR lenses have to cope with all the problems cited above, so fresnel tech was the logical way to go. You may point out that there are non-fresnel lenses in some HMDs already, but they are either in the insignificant regime (low FOV = low mass), as in the GearVR, or possibly much more advanced tech (high-index, highly homogeneous glass with a high-precision shape), as in the XTAL. I do not believe one could get the latter in a cheap plastic version; if that were possible, we would not need the heavy, bulky glass variant in the first place.

4 Likes

Didn’t Abrash mention pancake lenses as the next step after fresnel?

2 Likes

From what I remember, Abrash only said that fresnels are only good up to a certain FOV (~140° IIRC), not that clear lenses should be the solution for larger FOV. But if you have a link to the statement, I would be interested to see it.

1 Like

He likely did (and Carmack proposed interlaced rendering, which may work fine if we move the motion compensation layer to the headset). This is metamaterial territory, a quite interesting ongoing research topic. Flat lens - Wikipedia (There’s a large difference between classical prime lenses, frequently known as pancakes in photography, and flat lenses; I expect he was talking about the latter.)
Actual working results seem to date back as far as 2012, but I don't know of any in mass production, nor the limits in e.g. size and refraction. It's a future step for sure, but so are light field displays, which may not even need lensing.

1 Like

I followed up on a thread on Carmack’s twitter about interlacing, and Tom Forsyth mentioned that they tried it, but artifacts from head movement were too much.

I almost wish we had FED or SED technology, as those (micro electron emitters) have almost no display lag and would not even be fixed-pixel displays.

The fact that you would have a device under a vacuum strapped to your face is secondary lol

1 Like

Do you have a reference?

1 Like