How would Dynamic Foveated Rendering with Eye Tracking in games actually work?

I mean why else would you get eye tracking? Seriously what other reason is there?

DFR is one application of eye tracking. The other is controlling objects in VR; think aiming with your gaze, for example. But I would not put much hope in that either, because unless it gets massive support from the headset manufacturers it will be a niche inside a niche and no one will bother to implement it (especially if it does not work 100% perfectly). Something like hand tracking now.

That’d be pretty cool for an Iron Man game. Can’t see it being in many titles.

Watching that stuff paints a picture of Quest = developer $$$, so there’s not much chance of any visually high-end games using FFR for a year or two anyway.

I mean running dual 4K at high FPS is still going to need FFR.

The one thing Pimax could do entirely on their own is adjust the barrel distortion to compensate for pupil swim when you look around.
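A minimal sketch of what gaze-following distortion correction could look like, assuming a one-term radial (barrel) model whose centre follows the tracked gaze. The coefficient `k`, the normalized gaze coordinates, and the function name are illustration values only, not Pimax’s actual correction (a real fix would re-fit the whole lens profile per pupil position):

```python
def distort(u, v, gaze_u=0.5, gaze_v=0.5, k=0.22):
    """Map an undistorted UV coordinate to a distorted one.

    u, v:           texture coordinates in [0, 1]
    gaze_u, gaze_v: where the eye tracker says the pupil is looking (assumed input)
    k:              single barrel-distortion coefficient (made up here)
    """
    # Offset relative to the gaze-shifted distortion centre.
    du, dv = u - gaze_u, v - gaze_v
    r2 = du * du + dv * dv
    scale = 1.0 + k * r2          # classic one-term radial model
    return gaze_u + du * scale, gaze_v + dv * scale

# Looking straight ahead, the exact gaze point stays undistorted:
print(distort(0.5, 0.5))  # -> (0.5, 0.5)
```

The point of the sketch is only that the distortion centre becomes a per-frame input from the eye tracker instead of a fixed lens constant.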

As for things that need implementation in applications, you have all manner of interaction, from gaze-based selection (EDIT: augment Elite Dangerous’ current HMD_orientation_raycast highlighting of entities in space with reliable eye tracking, and I’d be delighted) to NPCs reacting to where your attention seems to lie.
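Gaze-based selection of the sort described above can be sketched as picking the entity angularly closest to the gaze ray within a tolerance cone. Everything here (the function names, the 3° cone, the head-relative positions) is a made-up illustration, not Elite Dangerous’ actual raycast code:

```python
import math

def angle_between(a, b):
    """Angle in radians between two 3D direction vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

def select_by_gaze(gaze_dir, entities, max_angle=math.radians(3)):
    """entities: list of (name, head-relative position) tuples.
    Returns the name closest to the gaze ray, or None if nothing is in the cone."""
    best, best_angle = None, max_angle
    for name, pos in entities:
        a = angle_between(gaze_dir, pos)
        if a < best_angle:
            best, best_angle = name, a
    return best

ships = [("Sidewinder", (0.0, 0.05, 10.0)), ("Anaconda", (3.0, 0.0, 10.0))]
print(select_by_gaze((0.0, 0.0, 1.0), ships))  # -> Sidewinder
```

Swapping the head-orientation ray for the eye tracker’s gaze ray is the only change the quoted EDIT asks for; the selection logic itself stays the same.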

In addition to the pupil swim correction, there is actually enough parallax just from swivelling your eyeball that adjusting the game cameras accordingly, instead of having them just face forward all the time (EDIT3: OK, epically bad wording; they’d still face forward with the adjustment, but shift to the side as much as one’s pupils do), and giving you that view to look around, should add a not inconsiderable (EDIT2: …but subtle…) bit to presence.
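A back-of-envelope check on that parallax claim: the pupil sits some distance in front of the eye’s centre of rotation, so swivelling the eye translates the pupil sideways, and that translation is the per-eye camera offset the post suggests applying. The 11 mm figure is a rough anatomical assumption, not a measured value:

```python
import math

# Approximate distance from the eye's centre of rotation to the pupil (assumption).
EYE_RADIUS_MM = 11.0

def pupil_shift_mm(gaze_angle_deg):
    """Lateral pupil translation when the eye swivels by gaze_angle_deg."""
    return EYE_RADIUS_MM * math.sin(math.radians(gaze_angle_deg))

# Looking 30 degrees to the side moves the pupil about 5.5 mm:
print(round(pupil_shift_mm(30.0), 1))  # -> 5.5
```

A few millimetres of camera shift is small but nonzero, which matches the post’s “not inconsiderable, but subtle” characterisation.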


At first I cringed a bit when I noticed that pattern in the speech, but then I realized that he actually made good points, usually supported by technical insight about the design decisions. So in the end it was a very insightful talk from someone who understands how the tech works and knows how to sell it by utilizing its full potential.

The Quest 2 looks like it will be the 2020/21 VR iPhone, and he said that everything should always be backwards compatible. Hopefully some developers will remember that other platforms exist.

Not sure that’s possible, or at least not without a noticeable cost in performance, I think. I have no proof of that; this opinion is based only on the fact that if it were that easy, it would have been done already.

What is foveated rendering if not rendering based on where you look? Sure, the blur effect is there to mitigate the downsides, but they are the downsides of “simple” foveated rendering.

Talk about blind negativity. Of course no one is supporting it, since there is basically no hardware to use it with (sure, the Vive Pro Eye and FOVE headsets exist… but how many are in the wild?).
Once eye tracking finds its place in more and more headsets, then you will see developers support it.
What Pimax and other VR headset makers can do by intervening that late in the rendering pipeline is obviously limited.

As some software developers have already shown (I gave an example for the Vive Pro Eye in my previous post), you have to implement it directly in the game/experience if you want more control and much better results.

Pimax FFR/DFR is just a “band-aid” for games that don’t implement such features yet (and why would they, as long as the eye-tracked headset market is so small… a niche of a niche market).

They are all the same simple foveated rendering. You termed them different variations, rather than the same technique with different post-processing. I only clarified.

I’m not being negative; it’s simply a chicken-and-egg scenario, and neither exists yet.

PS. They still won’t.

By the way, thanks for all the other clarifications. I forgot to do it d(^_~ )


Just wanted to throw in here that even if DFR works perfectly, there is still a (however slight) delay when switching a low-res part of the display to high-res.
You are guaranteed to see at least one frame of low-res every time you look somewhere new.
I would be impressed if they can keep it around three frames, to be honest.

Carmack is a genius as well. If he says it’s not happening that’s bad news.

Unless HTC were to sponsor an experience that showed DFR at its best, which is unlikely, Oculus are not about to do it.

Although you are reasonably right about the one-frame delay, it shouldn’t be mandatory (it should depend on when the information reaches the step in the rendering pipeline where it has an impact), since most eye-tracking modules in use or planned for use run at 120 Hz or more.
I don’t see why it should be three frames, to be honest.

If it’s a one-frame delay, your eye probably wouldn’t have enough time to refocus in order to see the low-res pixels.

From the perspective of someone playing a game that is CPU-limited, I doubt DFR actually has any benefit at all.

Can you link that speech video?

You’d just need to render a big enough high-res area that the eye can’t move out of it between two updates. Eye movement is fast, but not that fast.
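A rough sketch of that sizing argument: pad the high-res region so the eye cannot leave it within the total tracking-plus-rendering latency. The peak saccade velocity and latency figures below are order-of-magnitude assumptions, not measurements from any headset:

```python
# Peak saccade velocity, order of magnitude (assumption).
SACCADE_DEG_PER_S = 500.0

def foveal_margin_deg(total_latency_ms):
    """Extra radius (in degrees) to add around the gaze point so the eye
    stays inside the high-res region for one full update interval."""
    return SACCADE_DEG_PER_S * total_latency_ms / 1000.0

# A 120 Hz tracker sample plus one 90 Hz frame in flight is roughly 19 ms
# worst case, giving about a 9.5 degree margin:
print(round(foveal_margin_deg(19.0), 1))  # -> 9.5
```

In practice the margin can likely be smaller, since vision is suppressed during a saccade, but the arithmetic shows why a generously padded foveal region covers the latency concern raised earlier in the thread.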
