Foveated Rendering at Driver Level for RTX cards!? If possible please implement it Pimax!

Someone on Reddit linked to this:
Interesting new features on the Turing Architecture. (From Whitepaper)

It is an analysis of the RTX Turing features by a VR game developer “vblanco”.

Of special note is the part about Variable Rate shading. In it he says:

Variable Rate Shading: This is also a huge feature, but its biggest use is for VR. It allows the developer to change the “resolution” of parts of the screen at will. The fun part is that the internal images are unchanged, so it's not only extremely easy to implement, but it could be done as a driver-level toggle. With this, Nvidia could make VR games automatically render at a lower resolution on the edges of the screen, giving you between 20% and 40% extra performance without quality loss. For every single VR game currently in the market, without the developer doing anything. I'm not completely sure Nvidia will actually do that, but given how the technique works, it's definitely possible. If not, it's still basically a “toggle” a developer could add with barely any code. Looking at the feature, it seems I could implement it for my VR games in barely a day. (Page 43)

He also adds at the bottom:

The variable shading will make the new cards extremely efficient for VR, and ready for foveated rendering at the driver level, with the developer barely doing anything.

He also talks about many other great VR features of the RTX cards, but let's focus on this first.

So if what he says is true, then foveated rendering at the driver level could become a reality today!

As a first step, Pimax should be able to use it as an option to reduce GPU power spent on the periphery, perhaps by adding a setting in PiTool to enable it.

Then, when the eye tracking module is released and present, they could use it to dynamically adjust which areas get more or less GPU power, for true foveated rendering!

This could greatly improve performance, which in turn would allow higher supersampling and thus better image quality!

It seems Variable Rate Shading can even be used to increase quality in a specified area.

Anyway, if this is really possible, then I hope Pimax implements it at the driver level.


That's cool, yes. I would love an extra 20%, very cool.


Very nice indeed. I wonder what AMD is going to bring to the table.


Huh, that's really amazing. Also, this 20-40 percent figure probably assumes it being done on a 110-degree headset; I think the GPU saving at a 170-degree FOV would be significantly higher. Just look at the framerate change in tester reviews between wide and normal FOV modes on the 8K.
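The 20-40% figure, and why it should grow with FOV, follows from simple area arithmetic. Here is a rough sketch; the peripheral rate and foveal-coverage fractions below are illustrative assumptions, not measurements:

```python
def shading_work_fraction(foveal_fraction, peripheral_rate=0.25):
    """Fraction of full-rate pixel-shading work that remains when only
    `foveal_fraction` of the image is shaded at full rate and the rest
    at `peripheral_rate` (0.25 = 1 invocation per 2x2 pixels)."""
    return foveal_fraction + (1.0 - foveal_fraction) * peripheral_rate

# Assumed numbers: on a narrow-FOV headset the full-rate region covers
# ~50% of the image; on a wide-FOV headset that same angular region is
# a smaller share of the image, say ~25%.
narrow = shading_work_fraction(0.50)  # 0.625 of the work remains
wide = shading_work_fraction(0.25)    # 0.4375 of the work remains
print(f"narrow FOV: {1 - narrow:.0%} of shading work saved")
print(f"wide FOV:   {1 - wide:.0%} of shading work saved")
```

Under these assumed numbers the narrow-FOV case saves about 38% and the wide-FOV case about 56%, which matches the intuition that wide-FOV headsets stand to gain more.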


Wouldn't it be even simpler if Nvidia added this option in their control panel? Then Pimax wouldn't have to do anything.


I mean, in an ideal world, yes, but in reality the application (game) has to play ball. The big deal is that it sounds like this makes it a lot easier to implement on the dev end.


Yes, it may be that it has to be in Nvidia's driver. But I wonder if there is a way to inject the API calls from Pimax's software/drivers. If there is, then Pimax could add it, and it could be automatically enabled without devs having to do anything.

“vblanco” just responded in my post about this at r/Vive. He says:

They have now released the OpenGL extensions for this feature. The variable rate shading is literally 2 functions. One to upload a texture with the desired resolution, and another to “toggle” it on. There are no further changes to make in a game engine other than uploading the desired resolution texture and setting a toggle in the driver. Comparatively, multi-res shading (the VRWorks stuff) is a lot harder to implement.

That sounds very promising!
BTW, if you look at the whitepaper (page 43):

You will see what he means by “resolution texture”: it is just a texture that defines what level of shading (resolution) is needed for each part of the screen. For fixed foveated rendering it would only need to be set once, while with eye tracking enabled, I imagine it would be set every frame according to where the eye is looking.
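To make the idea concrete, here is a hypothetical sketch of building such a rate texture once, for fixed foveated rendering. This is illustrative Python, not real GPU code; the grid size, radii, and rate values are assumptions. In the actual extension, each texel covers a tile of pixels and indexes a palette of rates such as GL_SHADING_RATE_1_INVOCATION_PER_PIXEL_NV.

```python
import math

# Stand-in rate indices (assumed; the real values come from the
# extension's shading-rate palette).
FULL_RATE = 0    # 1 invocation per pixel (fovea)
HALF_RATE = 1    # 1 invocation per 2x2 pixels (mid periphery)
COARSE_RATE = 2  # 1 invocation per 4x4 pixels (far periphery)

def build_rate_texture(tiles_w, tiles_h, cx, cy, r_full, r_half):
    """Return a tiles_h x tiles_w grid of rate indices that gets
    coarser with distance from the fixation point (cx, cy), measured
    in tile coordinates."""
    tex = []
    for y in range(tiles_h):
        row = []
        for x in range(tiles_w):
            d = math.hypot(x - cx, y - cy)
            if d <= r_full:
                row.append(FULL_RATE)
            elif d <= r_half:
                row.append(HALF_RATE)
            else:
                row.append(COARSE_RATE)
        tex.append(row)
    return tex

# Fixed foveation: fixation locked to the lens center, computed once
# and then uploaded to the driver.
tex = build_rate_texture(16, 16, cx=7.5, cy=7.5, r_full=4, r_half=7)
print(tex[7][7], tex[7][0])  # -> 0 2 (full rate at center, coarse at edge)
```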

Would be wonderful if Pimax looked into how they can do this automatically in PiTool somehow.

That does sound very nice indeed! If enough developers implemented it quickly it would make even going from a 1080ti to a 2080 a big upgrade.


He also said that he thinks this could be done at the driver level, but he’s not sure if Nvidia would do that.

I find it hard to believe this is possible at the driver level. I mean, we know the FOVs of the Rift, Vive, Vive Pro, and WMR, so why wouldn't Nvidia just add this as a toggle in the graphics driver? If it gave a 20-30% performance improvement in VR at the driver level, they would be shouting it from the rooftops. VR is the perfect target market for the 20 series, yet we hear nothing.

Yes, that puzzles me too. They are so focused on the ray tracing stuff in their marketing and don't really talk about these VR features at all. I guess we'll see what happens.

The DLSS thing might also benefit VR performance, hopefully.

More from vblanco:

I've had a look at the Nvidia samples for this. It can cut the amount of pixel renders by 60-70% without losing image quality. It also takes barely a few lines of code, and those few lines are given to you in that sample by Nvidia, so having it work in something like Unreal Engine would be barely more than copy-pasting those couple of lines into the correct place. The sample also includes “foveated rendering” by putting the fovea where the mouse is pointing. Works like a charm. This would be amazing to try on the Pimax with the eye tracking module.
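The mouse-driven fovea in that sample implies a simple per-frame step: map the tracked gaze point to the rate-texture tile the fovea should center on, then rebuild and re-upload the texture around it. A small illustrative sketch, assuming a 16-pixel tile granularity for the rate texture (the function name and all numbers here are assumptions for illustration):

```python
TILE = 16  # assumed pixels covered per rate-texture texel

def gaze_to_tile(gaze_x, gaze_y, image_w, image_h):
    """Map a normalized gaze point in [0,1] x [0,1] to the (tile_x,
    tile_y) cell of the rate texture the fovea should center on."""
    tiles_w, tiles_h = image_w // TILE, image_h // TILE
    tx = min(int(gaze_x * image_w) // TILE, tiles_w - 1)
    ty = min(int(gaze_y * image_h) // TILE, tiles_h - 1)
    return tx, ty

# Per frame: read the eye tracker, recenter the rate texture on this
# tile, and re-upload it before rendering the eye buffer.
print(gaze_to_tile(0.5, 0.5, 2560, 1440))  # -> (80, 45)
```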


We definitely need something that does foveated rendering on more than just RTX.


8K-X! 8K-X! Ha ha. I’ll get my coat… :laughing: :taxi:


Pimax or game developers could actually do fixed foveated rendering right now on any GPU without eye sensors (via software) if they wanted, and see something like a 25% increase in performance.

Add to that DLSS, timewarp, etc., and performance could be pretty good!
