Waiting for AMD to release good stuff so I don’t have to deal with Nvidia Scum.
If you actually look at $$$/clock and similar metrics, the pre-binned cards are not that exorbitantly expensive; you are still getting what you pay for. The ‘professional’ cards (TITAN, Quadro, and such) are where things get stupidly expensive.
EDIT: By the way, I just did some calculations for the TITAN RTX. For about double the cost, performance should be less than 5% better, assuming the best reported overclockability. Now that’s stupidly expensive. Subzero chilling would be a better investment.
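To make that back-of-envelope concrete, here is a quick sketch. The launch MSRPs (~$2,499 for the TITAN RTX vs. ~$1,199 for the 2080 Ti FE) and the ~5% best-case performance delta are the assumptions here, not measured figures:

```python
# Rough price/performance sanity check: TITAN RTX vs. RTX 2080 Ti FE.
# Prices are approximate launch MSRPs; the 5% delta is the best-case
# overclock advantage assumed in the post above.

titan_price, ti_price = 2499.0, 1199.0
perf_gain = 0.05  # assumed best-case performance advantage

price_ratio = titan_price / ti_price              # roughly 2.08x the cost
perf_per_dollar_ratio = (1 + perf_gain) / price_ratio

print(f"Cost ratio: {price_ratio:.2f}x")
print(f"Perf-per-dollar vs 2080 Ti: {perf_per_dollar_ratio:.2f}x")
```

Under those assumptions, the TITAN delivers only about half the performance per dollar, which is the point being made.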
@noro
I think eye tracking is one of a combination of things that will yield maybe a 25% overall improvement in some, but not all, VR apps. Doubled framerates will happen when several things come together, perhaps including both new GPUs and eye tracking, but for the simulation apps under the most pressure, only when the apps themselves are optimized to take advantage of the new tech, maybe even multi-GPU. This will happen, but it’s not going to be a case of ‘this magic clickbait hardware brick solves all VR performance issues’.
Though it could be a case of just buying a GPU if NVIDIA would just offer us a card with twice as many units at an even close to fair $$$/unit price.
@mr.uu
Keep in mind DCS World performance in particular has become about 10%-20% worse in the past year, mostly due to a few specific updates, but still. Give it a year and a half… I hope enough 8kX units get into the field soon enough to convince developers to be more careful about their VR performance margins!
Here are the benchmarks.
Watch the difference between the 980 Ti and the 1080 Ti,
because the 980 Ti is 28 nm and the 1080 Ti is 16 nm,
and now we are going from 12 nm (2080 Ti) to 7 nm (3080 Ti).
Sorry I missed addressing that point.
The 980Ti is two releases behind, and never met the ‘threshold’ I consider to be wise for Pimax.
The 1080Ti is still a reasonable choice, though just below the point of diminishing returns. Visual quality will be a bit worse in the 8kX than it could be, and the lack of RTX features will limit workarounds like FFR, but it is still a reasonable overall compromise.
The 2080Ti is at the point of diminishing returns, both in cost/performance and supersampling for an 8kX. This makes it great for the flight sim community in particular.
The 3080Ti/3090 will improve things a bit, but it won’t work miracles. Even less so than the 2080Ti vs 1080Ti. The 2080Ti already reached the point of diminishing returns in supersampling/resolution, and apps that need Smart Smoothing now will continue to need it at that point. Additionally, software regressions will likely eat up some of these performance gains unless developers start to prioritize VR smoothness.
You’re assuming a lot based on your one experience with DCS World. Most games improve in performance over time and updates, not degrade.
Not assuming so much. I reference DCS World often because it is one of, if not the, least performant VR applications.
Like I said, some games may see an improvement in framerate, but not all, and usually not so much that we can just cast off Smart Smoothing at 90Hz, etc.
Actually, no. Games used to improve in performance. The reverse has been true for a while, especially not helped by the trend toward forced updates.
Those performance ‘improvements’ came about because, before Unity/Unreal/CryEngine mostly took over, game code tended to be so bad at properly initializing things that NVIDIA/Radeon drivers were realizing something like 20% gains per month for several years straight, just by correcting what the game code was doing wrong. This pattern continued roughly until Crysis performance stopped improving, a few years after its release.
One of the things that has surprised me most about VR applications has been their ability to steadily deliver frames within 10% of the margins for dropping frames. To make that happen, in addition to using game engines with more careful coding practices, some timings in the pipeline have had to be adjusted in ways that, early on, were both hardware and software specific.
There have been some academic articles documenting both these developments, but I have long forgotten where they are.
With all that done, there is little room for VR apps to ‘improve’ performance, short of actually fixing things like (lack of) multithreading, insufficiently optimized geometry, netcode interfering where it shouldn’t be involved, etc.
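The “within 10% of the margin for dropping frames” idea above can be made concrete with a little frame-budget arithmetic. The numbers in the example are illustrative, not measurements from any specific app:

```python
# Illustrative frame-budget math for VR frame-dropping margins.
# The budget depends only on the display refresh rate.

def frame_budget_ms(refresh_hz: float) -> float:
    """Time available to render one frame without dropping it."""
    return 1000.0 / refresh_hz

def headroom(frame_time_ms: float, refresh_hz: float) -> float:
    """Fraction of the budget left unused (negative means dropped frames)."""
    budget = frame_budget_ms(refresh_hz)
    return (budget - frame_time_ms) / budget

# At 90 Hz the budget is ~11.1 ms; a 10 ms frame leaves exactly 10% margin.
print(f"{frame_budget_ms(90):.1f} ms budget, "
      f"{headroom(10.0, 90) * 100:.0f}% headroom")
```

An app that steadily renders in 10 ms at 90 Hz is sitting right at that 10% margin: any regression larger than about 1.1 ms per frame starts dropping frames.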
Besides DCS World, please show some evidence of VR games that have decreased in performance since launch. I have only experienced increases… notably Lone Echo, HL: Alyx, Beat Saber, The Forest, No Man’s Sky, Red Matter, among others.
None have increased for me. IIRC, Elite Dangerous decreased a bit, especially on account of some volumetric effects becoming mandatory, but not enough that I couldn’t find workarounds sufficient even for the 5k+ (though FDev is now so bad they are simply dropping VR). PavlovVR has shown the most notable decrease, at least insofar as newer maps, especially workshop maps, have become increasingly likely to encounter severe stutters (updated maps as well, not just new ones), and now the 8kX (remember: reduced supersampling) is barely able to keep up.
Hi Matthew, could you please make up your mind?
Not out to get you in any way, or to offend you, but if your RTX 2080 Ti is barely able to keep up, why should we not fawn over the RTX 3080 and 3080 Ti cards?
I think I have been clear about this from the start.
The RTX 2080 Ti does keep up with all my worst cases, within about 10%-20% of dropping frames: DCS World, a 10% margin at ~1.5x TotalSR on the 8kX; PavlovVR, a 20% margin at 1.5x TotalSR on the 8kX. Of course, PavlovVR does not need to look that good, but wow, it is pretty.
Over a year and a half period, in the worst cases, I tend to see something like a 10% drop in performance. Throw in 20% for a bad SteamVR update or extra PiTool feature, and that could be 30% total.
So, if RTX 3080/3080Ti/3090 is 20% better… ultimately we might gain or lose 10% overall. Mostly, it’s a big hole in everyone’s finances (of course that’s what it’s really for).
Of course, we could get 40% better, with no losses. But then we still aren’t doubling framerates.
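The back-of-envelope above combines the changes additively; compounding them multiplicatively (which is how successive percentage changes actually stack) lands in the same ballpark. The input percentages are the ones from the posts above, applied as assumptions:

```python
# Compounding the scenario from the posts: a hypothetical 20% GPU uplift
# against a ~10% software regression over 18 months, plus a possible
# further 20% hit from a bad SteamVR update or extra PiTool feature.

def net_change(*changes: float) -> float:
    """Combine relative performance changes, e.g. +0.20 and -0.10."""
    total = 1.0
    for c in changes:
        total *= 1.0 + c
    return total - 1.0

worst = net_change(+0.20, -0.10, -0.20)  # uplift eaten by regressions
best = net_change(+0.20)                 # uplift with no losses

print(f"worst case: {worst * 100:+.0f}%, best case: {best * 100:+.0f}%")
```

Compounded, the worst case comes out around -14% net, close to the additive estimate, and either way nowhere near a doubling of framerates.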
I can guarantee THIS demo will be the hardest for anyone to run in VR. I have a 2080 Ti and can barely get 30fps. Any and all improvements next gen will be welcome. I demo this game at conventions and the poor performance takes a toll on people.
I will look into this. Ping me if I forget.
The stated system requirements (for Vive, anyway) are very modest, though. Pimax’s extra FOV should not add that much GPU load, and the higher resolution should help (reduced supersampling).
Nvidia GeForce GTX 970, AMD Radeon R9 290 equivalent or better (2GB or 4GB or more memory)
Seeking Dawn is way more demanding.
The problem is the demo is just not optimized in any way. The devs have even admitted to it, saying the demo was meant more as a proof of concept. I don’t know how accurate those settings are, but for my setup at least, it’s impossible to get a decent frame rate.
Now that you mention that one - I guess their change from CryEngine to Unreal could be about to pay off, come UE5: lots of high-poly static environments to go all Nanite on…
Yeah, I can’t wait to see what they can do; their work is some of the most photorealistic I’ve ever seen in a video game.
I found it works great for me, but then, IIRC, you said you refuse to turn down any settings from maximum for anything, so… yeah.
Exactly
Nice to see you remembered that this was my case
The first benchmark seems to indicate a 31% performance gain over a 2080 Ti FE: https://hardwareleaks.com/2020/06/21/exclusive-first-look-at-nvidias-ampere-gaming-performance/