Using the eye tracking module to select the correct distortion algorithm profile

We heard in the last update that Pimax was having trouble finding a one-size-fits-all algorithm for their lens distortion correction.
“With large FOV, it is hard to find the “one fits all” algorithm.”

“The distortion would be the result of several possible factors, including different FOV options, IPD, myopia/hyperopia, and the different distances between eyes and lenses. This is not an issue with small FOV headsets, but an issue with big FOV headsets.”

“Whatsmore, we plan to offer several settings for users to find their best fit. More information will be shared later.”

They mentioned they’d have profiles that you can choose from to find what best suits you. I’m wondering if the eye tracking could be used to automatically select the correct distortion algorithm for your eyes. Have you guys experimented with using the eye tracking module for distortion correction/distortion profile selection? :thinking: @deletedpimaxrep1

If this is possible, it’s another reason they should integrate eye tracking instead of making it a module! :wink:
AdHawk’s eye tracker would be a good fit for the 8K, as it has very low power consumption and is also very low priced: reportedly $10 per eye.


I was also thinking of this, but I’m not sure about the distance between the eyes and the lenses.

If they go with a camera-based tracker, I can see it. But with AdHawk I don’t quite know; they seem to use a light that scans your eye, so…

simply no.

Eye tracking can’t help with that setup problem. It just detects the position of the iris relative to the surrounding area, a two-dimensional representation. To my knowledge there is no indicator that lets you tell a change in IPD apart from a change in the distance of the eyes to the lenses, and the distortion correction could be totally different for those two cases.
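A toy illustration of the ambiguity described above, assuming a crude pinhole-style model (my own simplification, not any real tracker’s math) in which the tracker can only observe the angular offset of the pupil from the lens axis:

```python
import math

def observed_pupil_angle(ipd_mm: float, eye_lens_dist_mm: float) -> float:
    """Angle (degrees) at which a tracker on the lens axis would see the
    pupil in a crude pinhole model: only the ratio of the two matters."""
    return math.degrees(math.atan((ipd_mm / 2) / eye_lens_dist_mm))

# Two very different physical setups...
a = observed_pupil_angle(ipd_mm=64.0, eye_lens_dist_mm=16.0)
b = observed_pupil_angle(ipd_mm=72.0, eye_lens_dist_mm=18.0)

# ...yield the same 2-D observation, so this measurement alone cannot
# tell a wider IPD apart from eyes sitting further from the lenses.
assert math.isclose(a, b)
```

In this simplified model every (IPD, distance) pair with the same ratio is indistinguishable, which is the poster’s point: the tracker’s 2-D output underdetermines the 3-D geometry the distortion correction depends on.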

The point of the distortion correction is to compensate for optical issues that differ with the distance of the eyes to the optics and display, which varies between users because of the anatomy of their faces and eyeballs.

IMO they would be better off designing a slightly more complex software setup, like with some TVs where you are guided to your preferred picture settings by adjusting some sliders. The distortion correction could then be an individually calculated matrix rather than half a dozen preset patterns.
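A minimal sketch of that guided-calibration idea: two user-facing sliders are mapped onto the coefficients of a radial distortion polynomial, giving each user an individually derived correction instead of a fixed preset. The slider-to-coefficient ranges here are made up purely for illustration; nothing below is from Pimax.

```python
# Hypothetical guided calibration: the user drags sliders until straight
# lines look straight, and the coefficients are derived from the sliders.

def radial_distortion(r: float, k1: float, k2: float) -> float:
    """Classic radial polynomial term: r' = r * (1 + k1*r^2 + k2*r^4)."""
    return r * (1 + k1 * r**2 + k2 * r**4)

def coeffs_from_sliders(strength: float, falloff: float):
    """Map two sliders in [0, 1] onto (k1, k2). Ranges are invented."""
    k1 = -0.30 + 0.60 * strength   # overall barrel/pincushion amount
    k2 = -0.10 + 0.20 * falloff    # how fast it grows toward the edge
    return k1, k2

# Midpoint sliders give the neutral (no-warp) correction:
k1, k2 = coeffs_from_sliders(strength=0.5, falloff=0.5)
print(radial_distortion(0.8, k1, k2))  # unchanged radius at neutral settings
```

The appeal of this design is that the headset never needs to measure your face at all; your own eyes are the sensor, the same way a TV calibration pattern works.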

The way most eye trackers work is by taking lots of pictures and using those to determine the center of the pupil. If they went with a camera solution, they could possibly also use those pictures to estimate the distance from eye to lens. If they went with the AdHawk solution, I don’t know; the way I understand it, they scan a light across your eye, and… that’s about it.
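The core step of the picture-based approach described above can be sketched as a dark-pixel centroid: threshold the grayscale image for dark pixels and average their positions. This is a deliberately naive illustration; real trackers add glint removal, ellipse fitting, and much more.

```python
# Naive pupil-centre estimate: centroid of the dark pixels in a
# grayscale image (0 = black), represented here as a 2-D list.

def pupil_center(image, threshold=50):
    """Return (row, col) centroid of pixels darker than threshold."""
    dark = [(r, c)
            for r, row in enumerate(image)
            for c, v in enumerate(row)
            if v < threshold]
    if not dark:
        return None  # no pupil found in frame
    return (sum(r for r, _ in dark) / len(dark),
            sum(c for _, c in dark) / len(dark))

# Synthetic 5x5 "eye image" with a dark 2x2 pupil in the middle:
img = [[200] * 5 for _ in range(5)]
for r in (2, 3):
    for c in (2, 3):
        img[r][c] = 10

print(pupil_center(img))  # -> (2.5, 2.5)
```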

It would be great if the 8K was just eye-tracking ready. Kind of similar to how the Tesla Model S ships with hardware Tesla claims is ready for full autonomy, but you can’t use it yet because the software isn’t there. In other words, it ships with the hardware even if it isn’t active yet.


Does the headset have a forward/backward adjustment to get the best personal distance from the lenses, or are we just using different face masks? I think the lens distance will also be important, as the fused lens will surely require a certain angle/viewpoint to minimise any distortion.

Edit: thinking about it, this will probably be mostly achievable with the IPD adjuster.

I think in the end they will use a third-party solution. And all an independent eye-tracker manufacturer promises is a coordinate for where on the view or screen the eyes rest; an advanced version will give separate coordinates for each eye. But AFAIK that’s about it, and it is already hard to get very precise and robust coordinates while minimizing size and power needs.

IMHO: just drop the thought; there is not enough correlation between how somebody moves their eyes and the optical aberrations. But show me an optician that uses some sort of eye tracker to define prescription glasses and I will change my mind. :wink:

Once you involve eye tracking, it is not a “set once” matter but something you do continuously, for every frame, to overcome so-called pupil swim: distortion caused by not looking through the lens along its axis, which makes the view warp as you look around and shift the position of your pupils.
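A sketch of that per-frame idea: rather than one fixed pre-warp, shift the distortion centre every frame toward where the tracker says you are looking, so the correction follows the pupil instead of the lens axis. The model and all constants here are illustrative inventions, not from any real headset runtime.

```python
# Gaze-dependent pre-warp (toy model). The same screen point gets a
# different correction depending on gaze, which is why this must run
# every frame rather than being a one-time setting.

def warp_point(x, y, gaze_x, gaze_y, k1=-0.15, follow=0.3):
    """Radially distort (x, y) around a centre pulled toward the gaze."""
    cx, cy = follow * gaze_x, follow * gaze_y   # gaze-shifted centre
    dx, dy = x - cx, y - cy
    r2 = dx * dx + dy * dy
    scale = 1 + k1 * r2                         # simple radial term
    return cx + dx * scale, cy + dy * scale

# Same screen point, two gaze directions -> two different pre-warps:
looking_center = warp_point(0.5, 0.5, gaze_x=0.0, gaze_y=0.0)
looking_right = warp_point(0.5, 0.5, gaze_x=0.8, gaze_y=0.0)
print(looking_center, looking_right)
```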

I think it would be sufficient to provide a slider (or custom values) for the distance between eye and lens, with a distinct distortion matrix for each distance, so the user can adjust the slider until the perceived distortion is minimized.
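The suggestion above could be sketched as a small lookup of per-distance distortion coefficients with linear interpolation between the nearest calibrated points as the slider moves. The distances and coefficient values below are made up for illustration.

```python
# Hypothetical per-distance calibration table: eye-to-lens distance (mm)
# mapped to radial distortion coefficients (k1, k2) measured for it.
CALIBRATED = {
    12.0: (-0.20, 0.05),
    16.0: (-0.26, 0.08),
    20.0: (-0.33, 0.12),
}

def coeffs_for_distance(d_mm: float):
    """Linearly interpolate (k1, k2) between the two nearest calibrations."""
    pts = sorted(CALIBRATED)
    d = min(max(d_mm, pts[0]), pts[-1])          # clamp to calibrated range
    for lo, hi in zip(pts, pts[1:]):
        if lo <= d <= hi:
            t = (d - lo) / (hi - lo)
            (a1, a2), (b1, b2) = CALIBRATED[lo], CALIBRATED[hi]
            return a1 + t * (b1 - a1), a2 + t * (b2 - a2)

# Slider at 14 mm: halfway between the 12 mm and 16 mm calibrations.
k1, k2 = coeffs_for_distance(14.0)
print(round(k1, 4), round(k2, 4))  # -> -0.23 0.065
```

A handful of measured distances plus interpolation gives effectively continuous adjustment without Pimax having to ship a separate profile for every possible face shape.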

AdHawk is accurate down to 0.25 degrees and can run all day on a battery. We already know the eye-tracking module is third party.
