Eyetracking When?

https://community.openmr.ai/t/about-the-pimodules-accessories-category/24427?u=heliosurge

Well said. We will supposedly have everything "soon", but "soon" is starting to feel abstract. Frankly, I was a backer (#942) and paid an extra 100 euros for eye tracking, and so far I only have the headset. We have been extremely patient; Pimax could at least be generous with these backers, which would seem genuinely respectful. And here we are, March 2020, still waiting and waiting… and that is setting aside the coronavirus problem, obviously.

4 Likes

We should hopefully have an update this week.

3 Likes

What do you need eye tracking for?

It's totally useless right now.

Pimax is not Nvidia

1 Like

This.

Unless you are an active developer, eye tracking is (AFAIK) not supported by any games, VR applications, or foveated rendering drivers. Even if it were supported for foveated rendering, it's just not necessary, since current graphics cards are already adequate for existing applications.

1 Like

NVidia and AMD are never going to implement any features for eye tracking before there is a decent base of eye trackers out there. We wait for them, they wait for us. Let’s break the status quo.

3 Likes

NVIDIA… AMD… do things at the driver level, which we don’t need, because the hardware is basically adequate.

We need real support to do cool things with eye tracking in-game, like navigating menus just by glancing at them.
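For illustration only (this is not an actual Pimax or SteamVR API), a dwell-based gaze button could look something like the sketch below; the 2-degree cone and 0.5-second dwell time are made-up values:

```cpp
// Hypothetical sketch of dwell-based gaze selection: an item is "clicked" when the
// gaze direction stays within a small cone around it for long enough.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static float Dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static float Length(const Vec3& v) { return std::sqrt(Dot(v, v)); }

// Angle in degrees between the gaze direction and the direction to a menu item.
static float AngleDeg(const Vec3& gazeDir, const Vec3& toItem) {
    float c = Dot(gazeDir, toItem) / (Length(gazeDir) * Length(toItem));
    return std::acos(std::fmax(-1.0f, std::fmin(1.0f, c))) * 57.29578f;
}

struct GazeButton {
    Vec3 direction;          // direction from the eye to the button
    float dwellSeconds = 0;  // how long the gaze has rested on it

    // Call once per frame; returns true on the frame the button activates.
    bool Update(const Vec3& gazeDir, float dt) {
        if (AngleDeg(gazeDir, direction) < 2.0f) {   // within a 2-degree cone
            dwellSeconds += dt;
            if (dwellSeconds >= 0.5f) { dwellSeconds = 0; return true; }
        } else {
            dwellSeconds = 0;                         // gaze left the button, reset
        }
        return false;
    }
};

int main() {
    GazeButton resume{{0.1f, 0.0f, 1.0f}};
    Vec3 gaze{0.1f, 0.0f, 1.0f};                      // user staring at the button
    for (int frame = 0; frame < 60; ++frame)
        if (resume.Update(gaze, 1.0f / 90.0f)) std::printf("Resume selected\n");
}
```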

2 Likes

One thing I think many forget is that eye tracking has other uses besides foveated rendering, like menu selection, as demoed by Fove and Tobii.

This was already done before in menus.

On a side note, we don't know if the OP is into development of apps or not.

Sorry, I thought you mentioned foveated rendering. It will bring better graphics for the same hardware. That’s not a prediction, it’s a fact.

1 Like

I was talking to @SweViver in Gouda a couple of months ago, and we discussed that eye tracking might be able to help bring down distortions. It's very hard to have a good profile over a wide FOV, but if you know where the user is looking, you can use the most appropriate profile from a range: different profiles for looking to your left, right, and straight ahead, and possibly a few in between. I'm sure it would not be easy to implement, but the approach seems promising. Just my opinion as a non-expert.
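To illustrate the idea (purely a hypothetical sketch, not anything Pimax has confirmed), a compositor could keep a few precomputed distortion profiles and blend between the two nearest ones based on the reported gaze yaw; the profile angles and coefficients below are placeholder values:

```cpp
// Hypothetical sketch: blending precomputed lens-distortion profiles by gaze yaw.
#include <algorithm>
#include <array>
#include <cstdio>

struct DistortionProfile {
    float gazeYawDeg;  // gaze direction this profile was calibrated for
    float k1, k2;      // radial distortion polynomial coefficients (made-up values)
};

// Three calibrated profiles: looking left, straight ahead, looking right.
static const std::array<DistortionProfile, 3> kProfiles = {{
    {-20.0f, 0.21f, 0.05f},
    {  0.0f, 0.18f, 0.04f},
    { 20.0f, 0.21f, 0.05f},
}};

// Linearly interpolate between the two profiles bracketing the current gaze yaw.
DistortionProfile ProfileForGaze(float gazeYawDeg) {
    float yaw = std::clamp(gazeYawDeg, kProfiles.front().gazeYawDeg,
                           kProfiles.back().gazeYawDeg);
    for (size_t i = 0; i + 1 < kProfiles.size(); ++i) {
        const auto& a = kProfiles[i];
        const auto& b = kProfiles[i + 1];
        if (yaw <= b.gazeYawDeg) {
            float t = (yaw - a.gazeYawDeg) / (b.gazeYawDeg - a.gazeYawDeg);
            return {yaw, a.k1 + t * (b.k1 - a.k1), a.k2 + t * (b.k2 - a.k2)};
        }
    }
    return kProfiles.back();
}

int main() {
    DistortionProfile p = ProfileForGaze(8.0f);  // user glancing slightly to the right
    std::printf("k1=%.3f k2=%.3f\n", p.k1, p.k2);
}
```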

Other than the technical possibilities though, I do agree that we need killer features to help it forward. Whack-A-Mole by itself is not going to push the envelope :slight_smile:

3 Likes

Xunshu mentioned during the M1 beta that summer that the team was evaluating dynamic distortion correction with eye tracking. :pi_thumbsup:

3 Likes

I’m sorry, but buyers of the eye-tracking module are interested in foveated rendering primarily to increase performance as much as possible; I doubt that current graphics cards will achieve the native 75 fps of the Pimax 8KX at a high level of detail. Didn’t Pimax say that the module achieved a minimum of 25% more performance? For Pimax’s FOV, I think that is very little.
For example, Oculus is trying to increase graphics card performance by 67% using eye tracking and AI.
I am a developer and will buy the module (if it helps me) to test eye tracking for use in games, alongside the Vive Eye and the anticipated Oculus Quest 2, which could also use it.

3 Likes

Now HTC has discontinued the Vive and Vive Pro and will only be making the Vive Eye, so all Vives will have eye tracking, as well as the Vive Cosmos or Vive Focus. Pimax has a module, and it seems to me that Oculus will start adding this feature to their models too, especially if HTC does.

1 Like

Latency on current eye tracking is too high for it to be useful in gaming, because in intense scenes people look in all directions constantly.

At least that’s what I heard.

3 Likes

By the time eye tracking foveated rendering can be implemented, even better graphics hardware will be available. The impact will ultimately be minor compared to the use of eye tracking as an input method.

Also keep in mind foveated rendering imposes some costs of its own.

2 Likes

Neos VR already has eye (also full body) tracking for the Vive. I’m sure the dev would add support for the Pimax eye tracker once that comes out. In a metaverse style app, full body tracking, including eyes really adds to the immersion. The dev was recently demonstrating lower face/mouth tracking as well.

1 Like

Well, with better-performing hardware we would be able to make HMDs with higher resolution and/or higher frame rates, right?

It’s not a matter of input method versus foveated rendering; both are advantages that become available with eye tracking.

The overall cost is lower from an image-quality-per-GPU-hardware perspective. And anyway, it’s only optional, once it’s an option.

Let’s agree that there are several possible advantages to eye tracking. I really hope we can start exploring them soon.

3 Likes

I’m not sure that current graphics hardware is yet at the point which would make dynamic foveated rendering acceptable (without ugly artifacts). It does depend on the game and your tolerance for flicker.

I used fixed foveated rendering on conservative for over a month. I never really got used to the flicker at the edges of my vision. Even when looking straight ahead (through the normally rendered area), I would often see noticeable aliasing and small details (just a few pixels wide) flicker in and out of existence around the periphery. I found it distracting and immersion breaking, even though it provided a significant framerate boost.

My personal feeling is that DFR will not reach its full potential until your GPU can fully render the whole visual area in “normal” mode. Then, with additional GPU power, you can crank the super-sampling in the foveated area, to increase the visual fidelity. nVidia has already implemented this for FFR in some games.
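As a rough sketch of that approach (the eccentricity bands and scale factors below are invented for illustration), the idea would be to never drop below native resolution and only spend spare GPU headroom on extra supersampling near the gaze point:

```cpp
// Hypothetical sketch: supersample the foveal region when GPU headroom allows,
// render everything else at plain native resolution (never undersample).
#include <cstdio>

// Returns a resolution multiplier given the angular distance (degrees) from the
// gaze point and how much GPU headroom is left (0 = none, 1 = 100% spare).
float ResolutionScale(float eccentricityDeg, float gpuHeadroom) {
    const float kMaxBoost = 1.0f + gpuHeadroom;        // e.g. 2.0x at 100% headroom
    if (eccentricityDeg < 10.0f) return kMaxBoost;     // foveal region: full boost
    if (eccentricityDeg < 25.0f)                        // falloff band
        return 1.0f + (kMaxBoost - 1.0f) * (25.0f - eccentricityDeg) / 15.0f;
    return 1.0f;                                        // periphery: native resolution
}

int main() {
    const float samples[] = {0.0f, 15.0f, 40.0f};
    for (float e : samples)
        std::printf("eccentricity %4.1f deg -> scale %.2f\n", e, ResolutionScale(e, 0.5f));
}
```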

3 Likes

The cost of foveated rendering is on the CPU single-thread side. Whereas GPUs continue to improve steadily, CPU single-thread performance has more or less plateaued for many years now.

For performance, I would rather have usable SLI support than more foveated rendering hacks.

Eyetracking will prove far more novel for things like menu selection.

1 Like

Then the foveated rendering software needs to be optimised for multi-core CPUs.

FoveVR already had viable foveated rendering with eye tracking, but it was too far ahead of its time and, unfortunately, too proprietary.

However, we see this all too often: companies having the vision for the future too soon. The Amiga comes to mind, with its design and its focus on being a user-friendly multimedia PC.

I’ve tested the eye tracking module, and it does help considerably with rendering; you get several settings, similar to FFR, to adjust the effect. Games that naturally have a low frame rate typically benefit the most.

Of course, the system offers more features than just that, but those must be specifically coded for at the moment (like menu selection, etc.).

We should have more information soon regarding production to incorporate into one of the next few weekly updates.

7 Likes