I think eye-tracked foveated rendering is still an unsolved problem because AMD and Nvidia aren’t happy with the idea of reducing GPU requirements by an order of magnitude. That’s how many generations of GPUs? There’s money to be extracted from pcmasterrace… Sony and Qualcomm may push it tho, Sony to try to make PSVR2 wireless, Qualcomm to close the huge gap between SoC performance and PC GPUs.
You’d still free up GPU power to turn up settings or render at a higher resolution. So while it won’t mean a higher framerate, it will look much clearer/better.
Not sure how Pimax is doing their FFR. It seems like it’s more of a hack. We’ve had FFR on the Vive/Rift since Batman: Arkham VR, which uses Nvidia’s FFR API. On the Quest there is also an API for this that the game/app must tap into; it’s not automatic. Most apps there use FFR since they need every trick to get running well. Quest hardware is all the same, so it’s a known target.
This will be the same for eye-tracked foveated rendering: the application must be coded for it. So the chance that a lot of apps will support this without an HMD having it built in is next to nothing.
They’re using variable rate shading. It’s a hardware feature on the 20-series GPUs. Any game that works with Pimax’s fixed foveated rendering is extremely likely to support eye-tracked foveated rendering: it’s as simple as moving the low- and high-resolution areas to match where the eyes are looking. It may be that some games can use variable rate shading but aren’t utilizing it, which would explain why some games work and others don’t. Who knows?
This is NOT true. The application needs to use APIs to do this.
That being said, the VRS Helper NVAPIs make it a lot easier to program your app to support foveated rendering on Nvidia cards.
If Pimax can change the presets for FFR (which they likely can), I wouldn’t be surprised at all if they can modify them to suit eye tracking. As far as I know there’s nothing stopping Pimax from adjusting the foveation areas themselves.
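The VRS mechanics discussed above can be sketched in a few lines. This is an illustrative model, not any vendor’s actual API: with VRS the driver/engine supplies a per-tile shading-rate map, and the only difference between fixed and eye-tracked foveation is where the full-rate region is centered. All names, tile sizes, and radii here are made up for illustration.

```python
# Illustrative sketch: build a per-tile shading-rate map for foveated
# rendering. Fixed foveation centers the full-rate region on the middle
# of the screen; eye-tracked foveation just moves that center to the
# current gaze point. Rates, tile counts, and radii are invented here.
import math

FULL_RATE = "1x1"    # full shading work per pixel (sharp)
HALF_RATE = "2x2"    # one shade per 2x2 block (quarter of the work)
COARSE_RATE = "4x4"  # one shade per 4x4 block (sixteenth of the work)

def shading_rate_map(width_tiles, height_tiles, gaze_x, gaze_y,
                     inner_radius=4.0, outer_radius=8.0):
    """Return a 2D list of shading rates, full rate near the gaze point."""
    rates = []
    for ty in range(height_tiles):
        row = []
        for tx in range(width_tiles):
            dist = math.hypot(tx - gaze_x, ty - gaze_y)
            if dist <= inner_radius:
                row.append(FULL_RATE)
            elif dist <= outer_radius:
                row.append(HALF_RATE)
            else:
                row.append(COARSE_RATE)
        rates.append(row)
    return rates

# Fixed foveation: high-res region locked to the center of the view.
fixed = shading_rate_map(20, 20, gaze_x=10, gaze_y=10)
# Eye-tracked foveation: same code, gaze point supplied by the tracker.
tracked = shading_rate_map(20, 20, gaze_x=4, gaze_y=15)
print(fixed[10][10], tracked[15][4])  # both full rate at the gaze point
```

The point being: once an app feeds such a map to the GPU at all, updating the center from an eye tracker each frame is a small change.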
Now that Pimax China is back in the office, can we get clarification on how this stuff actually works?
FFR already works without explicit support from the application itself. It’s easy and cheap, not completely satisfying (distracting aliasing), but that’s the extent of what they can do at their level in the rendering pipeline.
It wouldn’t be surprising to see it work as a simple DFR solution thanks to eye-tracking.
That doesn’t mean you don’t need a serious implementation from the developers if you want to go further.
Well, whatever Pimax is doing with FFR is a hack (which is fine). I think we’ll need a standard for DFR to take off, something in DirectX/Vulkan so it will work on any GPU. Looks like Oculus is doing something like this with the Quest and Vulkan, but it’s targeting the Qualcomm GPU: Developer Insights: How to Develop with Vulkan for Mobile VR Rendering
A standard would obviously help, but there aren’t that many rendering engines used in VR; in the meantime, it would help to at least have the existing custom solutions implemented in Unity/Unreal, which cover most of the VR games out there.
So Vulkan has this, but it’s targeting Nvidia cards.
AMD needs to jump on the variable rate shading bandwagon!
Or games need to use DirectX 12:
https://techreport.com/news/34521/microsoft-standardizes-variable-rate-shading-as-a-part-of-directx-12/
So Oculus added this to the Quest this week. Do we have this feature already? If not, when?
I know we have foveated rendering, but the way Oculus does it, it only turns on when you need it.
I heard something about this today, too…
No eye tracking in the Quest (afaik), so I guess we now have to distinguish between two aspects of “dynamic” FR: eye-tracked, which follows the user’s gaze; and positionally fixed, just like current FFR, but performance-adaptive, which progressively encroaches on the view from the periphery as needed (just like the adaptive render resolution we have seen so far in a few titles) and shrinks back when performance headroom returns.
In a way we already had a simple variety of this in the Aperture Science Robot Repair thingie, where the lowest quality step has everything outside a central radius checkerboard-shaded. :7
They added a variation of Fixed Foveated Rendering called “Dynamic Fixed Foveated Rendering”, which essentially just selects the level based on performance…
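That level-selection idea can be sketched in a few lines. This is a guess at the general shape, not Oculus’s actual implementation: the runtime steps the fixed-foveation level up when frame times blow the budget and backs off when headroom returns. The level names, thresholds, and the 13.9 ms budget (roughly one 72 Hz frame) are all illustrative assumptions.

```python
# Illustrative sketch of "dynamic" fixed foveated rendering: no eye
# tracking involved, the runtime just picks a stronger fixed-foveation
# level when frame times exceed budget and relaxes it when they don't.
# Level names and thresholds are made up; 13.9 ms ~ one 72 Hz frame.

LEVELS = ["off", "low", "medium", "high"]  # stronger = more periphery degraded

def adjust_level(current, frame_time_ms, budget_ms=13.9):
    """Step the foveation level up or down based on measured frame time."""
    i = LEVELS.index(current)
    if frame_time_ms > budget_ms and i < len(LEVELS) - 1:
        return LEVELS[i + 1]   # missing budget: degrade the periphery more
    if frame_time_ms < 0.8 * budget_ms and i > 0:
        return LEVELS[i - 1]   # comfortable headroom: back off a step
    return current

level = "off"
for ft in [15.2, 14.5, 12.0, 9.0]:  # simulated frame times in ms
    level = adjust_level(level, ft)
print(level)  # ends at "low": ramped up twice, held once, relaxed once
```

The dead band between 0.8× and 1.0× of the budget is there to keep the level from oscillating every frame.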
I did a feature request here:
https://community.openmr.ai/t/oculus-quest-gets-dynamic-fixed-foveated-rendering-to-balance-quality-performance-maybe-a-feature-for-pitool-also/24563?u=drwilken
Maybe they will take it seriously now that Oculus has done it.