How would Dynamic Foveated Rendering with Eye Tracking in games actually work?

I think everyone is so caught up in having to pay $200 that they forgot to ask the important question.

What will support in games actually look like?

6 Likes

It'll probably utilize Variable Rate Shading (VRS) from the RTX 20-series GPUs.
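
If so, at the API level it would look something like this minimal D3D12 sketch (assuming a Turing-class GPU exposing VRS Tier 1; the function and draw calls are placeholders, not Pimax's actual code):

```cpp
#include <d3d12.h>

// Minimal VRS Tier 1 sketch: everything drawn between the two calls is
// shaded once per 4x4 pixel block instead of once per pixel.
void DrawWithCoarseShading(ID3D12GraphicsCommandList5* cmdList)
{
    // PASSTHROUGH combiners keep the per-draw rate, ignoring any
    // per-primitive rate or screen-space shading-rate image.
    const D3D12_SHADING_RATE_COMBINER combiners[2] = {
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH,
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH,
    };
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_4X4, combiners);

    // ... draw the peripheral geometry here ...

    // Restore full-rate shading for the foveal region.
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, combiners);
}
```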

1 Like

So on a driver level?

Would devs need to support it?

I don't think so. It'll work in a lot of games, but for some it won't. It's likely in the same boat as FFR (fixed foveated rendering).

I would pay the $200 if the implementation were really good, with significant bumps in performance.

Otherwise nah.

It'll probably bring pretty significant gains in a lot of games. FFR already increases fps quite a bit in some titles.

You just have to imagine the current FFR implementation, but one that follows where you look.

FFR is a good workaround, but in addition to the distracting aliasing outside the center area, you can't be too aggressive with it.
If that implementation followed where you look, you could probably always use the most aggressive setting and never actually see the low-resolution region.
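
To make that concrete: with VRS Tier 2 the GPU reads a small "shading rate image" (one byte per screen tile), so a runtime could simply rebuild that image every frame around the gaze point instead of a fixed center. A rough sketch, with invented thresholds (the real tile size has to be queried from the device):

```cpp
#include <cmath>
#include <cstdint>
#include <vector>
#include <d3d12.h>

// Rough sketch: fill a Tier 2 shading-rate image so that full-resolution
// shading follows the tracked gaze. One byte per screen tile; the tile
// size is hardware-dependent (D3D12_FEATURE_DATA_D3D12_OPTIONS6).
std::vector<uint8_t> BuildRateImage(int tilesX, int tilesY,
                                    float gazeU, float gazeV) // gaze in [0,1]
{
    std::vector<uint8_t> rates(static_cast<size_t>(tilesX) * tilesY);
    for (int y = 0; y < tilesY; ++y) {
        for (int x = 0; x < tilesX; ++x) {
            const float u = (x + 0.5f) / tilesX;
            const float v = (y + 0.5f) / tilesY;
            const float d = std::hypot(u - gazeU, v - gazeV);
            uint8_t rate = D3D12_SHADING_RATE_4X4;             // far periphery
            if (d < 0.10f)      rate = D3D12_SHADING_RATE_1X1; // fovea
            else if (d < 0.25f) rate = D3D12_SHADING_RATE_2X2; // transition
            rates[static_cast<size_t>(y) * tilesX + x] = rate;
        }
    }
    return rates; // upload to a GPU texture, bind via RSSetShadingRateImage()
}
```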

I expect the same, but it would be good to get clear confirmation of whether it will work on GTX-series cards or not. So far the question has been avoided.
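
For what it's worth, an application can at least check for itself: VRS arrived with the Turing architecture, so Pascal-based GTX 10-series cards report no support at all. A quick D3D12 sketch:

```cpp
#include <d3d12.h>

// Sketch: ask the driver whether the GPU supports VRS, and at which tier.
// Pre-Turing GTX (10-series/Pascal) cards return TIER_NOT_SUPPORTED here.
bool SupportsVrsTier2(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opts = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                           &opts, sizeof(opts))))
        return false;
    return opts.VariableShadingRateTier >= D3D12_VARIABLE_SHADING_RATE_TIER_2;
}
```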

2 Likes

They should ask how Oculus and Vive have pulled it off.

They haven't, since they don't have DFR (or FFR) unless it is implemented directly by the game.

Vive said it would be agnostic and work on everything…but maybe they were just trying to sell more eye trackers.

If it looks like the current FFR at the edges, it's a waste of money tbh…
The shimmering is too distracting…

1 Like

Yeah, I turned it off instantly; I couldn't stand it. Not to mention I get rendering issues in some games (textures turn black in the downsampled parts).

I could be wrong, but I haven't seen any built-in FFR/DFR support in the Vive Pro Eye software yet. Yes, having eye-tracking makes it possible for any dev to implement DFR in their game … but it's not automatic either. Maybe HTC will ask Valve to include an option such as what Pimax did (Steam being the equivalent of PiTool in this case), but considering they haven't since release, I suppose they won't bother with it for the moment (if they had, Pimax wouldn't have needed to implement their own).

You have the BMW M Virtual Experience, which has been updated to take advantage of the eye-tracking, and surely some other "experiences" too, but it's quite slow to propagate.

Having eye-tracking enables the possibility of DFR, but the end result depends entirely on how you implement it.
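
Even the first step, turning "where the eye points" into "where the full-detail region goes" on the render target, is an implementation choice. A hypothetical sketch (the gaze vector would come from whatever SDK the tracker ships with; the frustum tangents are the kind of values OpenVR's GetProjectionRaw provides, though with its own sign conventions):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Hypothetical sketch: project an eye-space gaze direction (-Z forward)
// onto the image plane and remap it into [0,1] UVs of the eye render
// target, using the projection frustum's tangent half-angles.
void GazeToFoveationCenter(Vec3 gazeDir,
                           float tanLeft, float tanRight, // here: tanLeft < 0
                           float tanUp, float tanDown,    // here: tanDown < 0
                           float& outU, float& outV)
{
    // Intersect the gaze ray with the image plane at z = -1.
    const float px = -gazeDir.x / gazeDir.z;
    const float py = -gazeDir.y / gazeDir.z;
    // Remap from tangent space to UVs (v = 0 at the top of the image).
    outU = (px - tanLeft) / (tanRight - tanLeft);
    outV = (tanUp - py) / (tanUp - tanDown);
}
```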

Pimax's implementation of FFR only plays with the resolution during the rasterization step (as far as I understood), thanks to the VRS tech that RTX cards can run effortlessly.
Simple and cheap, but that’s all Pimax can do at their “level” in the rendering pipeline. If you want to push this to the next level, it has to be implemented by the game developers (or at least the engine).
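
Purely as an illustration of that "next level" (numbers invented): once the engine itself knows the gaze point, it can cut work before shading even starts, for example by dropping mesh detail for objects away from the gaze:

```cpp
// Illustrative sketch only: gaze-dependent level-of-detail selection,
// something only the engine (not a compositor-level hack) can do.
int SelectLod(float angleFromGazeDeg, int maxLod)
{
    if (angleFromGazeDeg < 10.0f) return 0;          // fovea: full detail
    if (angleFromGazeDeg < 30.0f) return maxLod / 2; // near periphery
    return maxLod;                                   // far periphery
}
```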

Look at the different methods NVIDIA was showing (the first method is the same as FFR in PiTool … with a bit of exaggeration for demonstration purposes):

7 Likes

Yes, that temporally stable FFR is something I would look for, but can Pimax achieve this with the help of NVIDIA, or does the game itself need to support it? That's the question…

That would be the shading step, which is what the "S" in VRS stands for; this is where the per-pixel operations occur and "skipping" can be done, unlike in the preceding rasterization.

What your linked video shows is not the foveated rendering itself, but examples of different antialiasing/interpolation methods one can subsequently apply to mitigate the lower resolution in the periphery. These "fill in the blanks" where you have shaded only one pixel out of 2, 4, 8, or 16. The periphery may not resolve fine detail very well spatially, but it notices great changes in pixel values, both spatially and temporally from frame to frame, even more drastically than the fovea does - and you should also still notice the static pixel matrix…
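
Not any particular method from that video, but the most naive way to "fill in the blanks" is plain bilinear interpolation across the sparse samples; the better methods shown there add temporal accumulation/reprojection on top, precisely to fight that frame-to-frame flicker. A toy single-channel sketch:

```cpp
#include <algorithm>
#include <vector>

// Toy sketch: upsample a peripheral region that was shaded at one sample
// per (scale x scale) pixel block, filling the unshaded pixels with
// bilinear interpolation. Single-channel for brevity.
std::vector<float> FillInBlanks(const std::vector<float>& coarse,
                                int cw, int ch, int scale)
{
    std::vector<float> full(static_cast<size_t>(cw) * scale * ch * scale);
    for (int y = 0; y < ch * scale; ++y) {
        for (int x = 0; x < cw * scale; ++x) {
            // Position of this output pixel in coarse-sample space.
            const float sx = (x + 0.5f) / scale - 0.5f;
            const float sy = (y + 0.5f) / scale - 0.5f;
            const int x0 = std::clamp(static_cast<int>(sx), 0, cw - 2);
            const int y0 = std::clamp(static_cast<int>(sy), 0, ch - 2);
            const float fx = std::clamp(sx - x0, 0.0f, 1.0f);
            const float fy = std::clamp(sy - y0, 0.0f, 1.0f);
            const float top = coarse[y0 * cw + x0] * (1 - fx)
                            + coarse[y0 * cw + x0 + 1] * fx;
            const float bot = coarse[(y0 + 1) * cw + x0] * (1 - fx)
                            + coarse[(y0 + 1) * cw + x0 + 1] * fx;
            full[static_cast<size_t>(y) * cw * scale + x] =
                top * (1 - fy) + bot * fy;
        }
    }
    return full;
}
```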

A more recent video might have included DLSS, which pretty much reconstructs a "guessed" frame: the network is trained on the environment from every conceivable view angle, on the differences between a high-resolution "ground truth" version of each frame and the low-resolution one you would actually get when playing the game.

I am not yet convinced that the FFR we can enable in PiTool actually manages to affect the game's shaders, rather than just PiService's own compositor and barrel-warp shader.

4 Likes

I have just watched the Oculus Connect Day 2 keynote, which I recommend to anyone interested in VR technology (https://www.youtube.com/watch?v=PMIDaomx0GA). John Carmack did not sound very convinced that (dynamic) foveated rendering is happening.

I would not put much hope into that myself.

2 Likes

Of course it’s not happening. No one is supporting it. I’ve been saying Foveated Rendering is vaporware for years.

That's why I want to know how this works, and why Pimax should show it off before charging people $200.

4 Likes

Good luck with that. I am not sure Pimax knows how it works (or how it is supposed to work) though :wink:

1 Like

Well said man…