I’m curious to know if they’re planning to claw back some performance by reducing rendering quality in the periphery. At 200° there’s a sizeable chunk of our peripheral vision that doesn’t need to be rendered at 100%, so we could get some performance back by lowering the rendering quality at the edges, or at least have an option for it. Obviously the ideal would be foveated rendering, but as we’re quite a ways off from that, this might at least be something to look into.
You mean like what’s built into Raw Data? In terms of technology, if they did this at the driver level then it’s basically fixed-point foveated rendering (I know that’s a bad name for it) at the driver level. If they do that, then all they have to do to get full foveated rendering is tie the point from which that falloff occurs to the eye tracking rather than the centre of the screen.
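That “swap the centre point for the gaze point” idea can be illustrated with a toy sketch. This is not how any actual driver does it; the shading rates and distance thresholds below are invented for illustration, and the only point is that fixed and eye-tracked foveation can share the same falloff logic:

```python
def shading_rate(px, py, width, height, fovea=None):
    """Pick a shading rate (1.0 = full quality) for pixel (px, py)
    based on distance from the foveation point. With fovea=None the
    point is the screen centre (fixed foveated rendering); passing a
    gaze position instead turns the same logic into eye-tracked
    foveated rendering. Rates and thresholds are made-up placeholders."""
    fx, fy = fovea if fovea is not None else (width / 2, height / 2)
    # distance from the foveation point, normalised by half the short axis
    d = ((px - fx) ** 2 + (py - fy) ** 2) ** 0.5 / (min(width, height) / 2)
    if d < 0.5:
        return 1.0    # full resolution near the foveation point
    elif d < 1.0:
        return 0.5    # half rate in the mid-periphery
    return 0.25       # quarter rate at the extreme edges

# Fixed foveation: full rate at screen centre, reduced in a corner
print(shading_rate(960, 540, 1920, 1080))   # 1.0
print(shading_rate(0, 0, 1920, 1080))       # 0.25
# Eye-tracked: same function, just feed it the gaze point instead
print(shading_rate(0, 0, 1920, 1080, fovea=(100, 50)))  # 1.0
```

Coding the falloff around an arbitrary point from the start is exactly the kind of forward compatibility with the eye tracker mentioned below.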
I bring this up because I’d suggest that means the bulk of the work would be in the former, and if they did go to that trouble it would be wise to code it in a way that’s forward-compatible with the eye tracker.
Time will tell with testing and optimization. Though it might be possible, once the eye tracker is released, to use it for a more basic iteration of foveated rendering that might not need full game support.
Something along those lines. At least the extreme peripheries to start. Even if we only recover 10–15%, that’s more than worth it at these resolutions.
I like this idea. To me, it looks like only 20–25% of the screen would need to be “full res”. Games which support super-sampling, like Elite: Dangerous, could, for example, render the entire display at 75% SS and re-render the central area (only) at 200% SS. I would think that would increase framerate and quality substantially, compared to rendering the entire screen at, say, 150% SS.
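A rough back-of-envelope on those numbers, assuming SS is a per-axis resolution multiplier so pixel cost scales with its square (which is how Elite: Dangerous’s slider is commonly understood, though the exact scaling is an assumption here):

```python
def pixel_cost(ss, area_fraction=1.0):
    """Relative pixel cost of rendering `area_fraction` of the display
    at supersampling factor `ss` (per-axis multiplier, so the pixel
    count scales with ss squared). 1.0 means native-resolution cost."""
    return area_fraction * ss ** 2

# Two-pass scheme: whole display at 75% SS, central 25% redone at 200% SS
two_pass = pixel_cost(0.75) + pixel_cost(2.0, area_fraction=0.25)
# Uniform 150% SS over the whole display, for comparison
uniform = pixel_cost(1.5)

print(f"two-pass cost: {two_pass:.4f}x native")  # 1.5625x
print(f"uniform cost:  {uniform:.4f}x native")   # 2.2500x
print(f"saving: {1 - two_pass / uniform:.0%}")   # 31%
```

So under that assumption the two-pass scheme shades roughly 30% fewer pixels than uniform 150% SS while the central area gets a higher sample rate, which supports the “substantially” above, ignoring the overhead of the second pass itself.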
Something like that probably needs to be implemented inside the game itself, which means it’s unlikely to happen.
Another game with a great implementation of Multi Res Shading is Batman: Arkham VR (in case you want to experiment with it).