I didn’t want this to get buried in the other thread.
VRMark Blue Room is a very interesting test for the future of VR. It renders a very complicated scene with lots of polygons, a volumetric shadow-casting spotlight, and heavy anti-aliasing. The render resolution is 5120x2880. This is probably in the ballpark of where you’d want to render for the 8K-X (2x native 4K panels).
Scroll down to the Blue Room results on this page:
Of note, Futuremark lists a pass as 109 fps without an HMD, 88.9 fps with an HMD. So, with 99 fps on the 2080 Ti, we’re within spitting distance of a solid pass. Dropping the AA (because we want DLSS, right?), I think we can expect the 8K-X to be playable for new VR titles (albeit with a massive disclaimer about developer support).
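To put numbers on “spitting distance”, here’s a quick back-of-the-envelope sketch in Python (the 99 fps figure is the 2080 Ti result mentioned above; whether it was a desktop or HMD run isn’t stated, so both margins are shown):

```python
# Margin of the reported 2080 Ti score against Futuremark's pass thresholds
desktop_pass = 109.0   # pass threshold without an HMD, fps
hmd_pass = 88.9        # pass threshold in HMD mode, fps
measured = 99.0        # 2080 Ti Blue Room result quoted above, fps

print(f"vs desktop pass: {measured - desktop_pass:+.1f} fps "
      f"({(measured - desktop_pass) / desktop_pass:+.1%})")
print(f"vs HMD pass:     {measured - hmd_pass:+.1f} fps")
# vs desktop pass: -10.0 fps (-9.2%)
# vs HMD pass:     +10.1 fps
```

Either way, a ~9% shortfall is the kind of gap that dropping MSAA could plausibly close.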
More information on the benchmark here.
Edit: @SweViver do you have VRMark Advanced ($20, needed to get the Blue Room test)?
I’d assume it would, GPU-wise. But I don’t think it stresses the CPU as much as some games would. And if games rely on PhysX on the GPU, that could be a problem.
"BLUE ROOM BENCHMARK
The Blue Room benchmark is a more demanding test with a greater level of
detail that requires more powerful hardware. In fact, as of October 2016, no
publicly available system running as sold is able to pass this test. This makes
it the ideal benchmark for comparing high-end systems that are limited to
90 FPS when running the Orange Room benchmark in HMD mode.
The Blue Room shows the amount of detail that may be common in future
VR games. A PC that passes this test will be able to run the latest VR games
on the HTC Vive and Oculus Rift at the highest settings, and may even be
VR-ready for the next generation of VR headsets.
Target frame rate
The target frame rate for the Blue Room benchmark when running in HMD
mode is 88.9 FPS, slightly lower than the 90 Hz refresh rate of HTC Vive and
Oculus Rift to allow for occasional missed frames.
On the desktop, the same workload running on the same hardware
achieves a consistent average frame rate of 109 FPS. The difference is
explained by VR SDK overhead and HMD refresh rate, as explained earlier in
this guide.
Implementation
The Blue Room test is a more intense test in terms of GPU load. It draws in a
higher resolution and has more geometry. This results in a higher CPU load
for preparing the scene and D3D calls. As a result, there is no CPU physics
load in the Blue Room.
Rendering work focuses on surface and volumetric illumination. The test
uses a deferred tile-based lighting method with one volumetric shadow-casting
spotlight. The test features bloom effects and 2× MSAA as the anti-aliasing
solution with 16× anisotropic filtering.
The rendering resolution is 5120 × 2880 (5K). When a headset is connected,
the rendered image is then distorted so that it looks natural when seen
through the headset’s lens and scaled to the native 1080 × 1200 per eye
display resolution used by both the HTC Vive and the Oculus Rift."
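For context on how heavy that render target is compared to the panels it gets scaled down to, here’s a quick sketch of the per-eye pixel math (my own arithmetic, derived from the numbers in the quote above):

```python
# Per-eye pixel counts: Blue Room render target vs. Vive/Rift native panels
render_w, render_h = 5120, 2880   # Blue Room render resolution (both eyes)
native_w, native_h = 1080, 1200   # Vive / Rift per-eye panel resolution

per_eye_render = (render_w // 2) * render_h   # 2560 x 2880 per eye
per_eye_native = native_w * native_h

print(f"rendered pixels per eye: {per_eye_render:,}")   # 7,372,800
print(f"native pixels per eye:   {per_eye_native:,}")   # 1,296,000
print(f"supersampling factor:    {per_eye_render / per_eye_native:.2f}x")  # ~5.69x
```

Nearly 6x the pixels per eye is why this test is so much more punishing than rendering at the headsets’ native resolution.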
That’s pretty good for current VR, but you have to admit that in terms of price per performance it’s not great. $1200, and the Blue Room is just about even with what the 8K and 5K+ use, not counting good supersampling. I think that even with these cards, a 60 Hz mode would at least make more intense games playable (while also looking decent).
Just saw the Star Wars ray-tracing demo on the Nvidia subreddit, via a Vimeo link, and it was running 1440p DLSS at 60 Hz on the $1200 card while dipping into the 50s. A 1080 Ti could do about 15 fps, but that was at a NATIVE 1440p WITH RAY TRACING. These new GPUs reek of snake oil.
In that benchmark, a 1080 Ti scores 66 fps and a 2080 Ti scores 100 fps. That’s a ~52% increase.
But…
In DCS, for example, it was unplayable even with a 5K+ because a 1080 Ti could only manage 35-40 fps. A further 52% increase only yields ~61 fps… not exactly great.
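A quick sanity check of that scaling arithmetic, using the numbers from the two posts above:

```python
# Scale the 1080 Ti's DCS frame-rate range by the synthetic-benchmark uplift
uplift = 100 / 66          # ~1.52x, from the VRMark scores above
dcs_1080ti = (35, 40)      # reported 1080 Ti frame-rate range in DCS, fps

for fps in dcs_1080ti:
    print(f"{fps} fps -> ~{fps * uplift:.0f} fps")
# 35 fps -> ~53 fps
# 40 fps -> ~61 fps  (still well short of a 90 Hz target)
```

Even granting the synthetic uplift carries over perfectly (which it rarely does), you land in the 50s and low 60s.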
Hence my recommendation for a 60 Hz gaming mode on the 8K and 5K+. Without one, these HMDs will require reprojection, and the cost of hardware that avoids it will be prohibitive. I want to see AMD ignore ray tracing and just release a beast of a GPU.
Real-time ray tracing is just marketing BS by Nvidia. I saw a demo of a 2080 Ti running Shadow of the Tomb Raider with ray tracing enabled at just 1080p, getting a mere 30-50 fps, and the same goes for Battlefield V, so it’s not worth anything.
The new Tomb Raider, however, runs on my system at a solid 60 fps at 2K resolution with everything maxed out, without ray tracing. VSYNC is turned on to match my TV’s refresh rate, so it might actually be a lot more.
My hunch is the 1080 Ti would be more than able to handle ray tracing if you ran the ray-traced effects at 720p. I.e., the whole game runs at 1080p with the ray-tracing resolution at 720p or lower.
I’m gonna get a 1080 Ti; to hell with these new GPUs.
Real-time ray tracing isn’t completely marketing BS; it is probably the future… but it won’t be widely implemented or entirely playable this generation. Maybe the 3000 or 4000 series will have widespread adoption and the power to play at reasonable fps. I think I agree with your sentiment, though, that it shouldn’t be the reason you buy a 2080 or 2080 Ti.
I have a $1350 EVGA FTW3 Ultra on preorder, and it pisses me off, because I would never be buying this generation if I didn’t need every extra frame to drive the Pimax 8K; otherwise my 1080 runs my Vive Pro pretty well in most other games…
From the sounds of it, the video @SweViver just put out today resolves a lot of the quality issues with the 8K. If you really feel that strongly about the 2080 Ti you could probably cancel your pre-order, though you might even be able to sell it for more than that.
Yeah, you’re right, but so far it’s just for showing off without any real use. However, a console player might be satisfied with 30 fps at 1080p ray traced.
We almost had some real benchmarks for the 2080 Ti, but apparently they had an issue with their power supply.
VR?
If you want a solid plug-and-play GPU for a higher-res VR kit, particularly Vive Pro, the RTX 2080 will get you there. The same can be said for the 1080 Ti, however, and tests of both simple and complex software on both cards produced similar results: stable 90fps at “medium” and “high” settings, depending on the game, along with nasty pockets of slowdown upon pushing certain games’ visual or anti-aliasing limits.
Unfortunately, the RTX 2080 Ti combined with Vive Pro presented a surprise problem: consistent power spikes that brought our testing rig’s 650W-rated power supply unit (PSU) to its knees. Replacing a PSU on our small-form-factor Falcon NW machine threw up problems during our brief five-day testing period, so we had to put this test on the backburner. Expect an update soon; we assume the 2080 Ti is a VR beast, but we’d love to confirm how it boosts, say, the sometimes clunky Fallout 4 VR. For now, let this note serve as a warning: Nvidia’s assurance that 650W PSUs are sufficient for the 2080 Ti shouldn’t be taken as gospel.
There is a single 2080 Ti benchmark and four 2080 benchmarks online currently. So far the 2080 is +1% over the 1080 Ti and the 2080 Ti is +29% over the 1080 Ti.
Yeah, without DLSS the 2080 is a 1080 Ti that’s $200 more expensive… one that’ll produce film-grain artifacts in RTX games in 2-3 years, when said games become a thing…
With DLSS though, the 2080 should stomp the 1080 Ti.
That’s not a gaming benchmark; in actual games the 2080 Ti is showing up to 50% gains over the 1080 Ti. It depends what games you play, but that synthetic benchmark doesn’t seem to be a great comparison for VR games.
It won’t ship till next month, so I have a little time to cancel the 2080 Ti if the 8K performance will be OK with a 1080 Ti (if so, I’ll buy one for less than half the cost of the 2080 Ti). And who knows, maybe AMD will pull off a miracle later this year and match the 2080 series on standard rasterization at a much lower price…
On top of that, the 2080 Ti supports driver-side Variable Rate Shading, which can gain you up to another 40% performance. It’s essentially driver-side foveated rendering that you could theoretically use without any developer support.
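To see where a figure like “up to 40%” could plausibly come from, here’s a toy model of the fragment-shading work saved by coarsening the shading rate outside a central foveal region. The fovea fraction and coarse-shading cost are illustrative assumptions of mine, not NVIDIA’s numbers:

```python
# Toy model: fragment-shading work saved by coarse shading in the periphery.
# Assumes a central "fovea" shaded at full rate and the rest at 2x2 coarse
# shading (1/4 the fragment invocations). Numbers are illustrative only.
fovea_fraction = 0.3   # share of the frame kept at full shading rate (assumed)
coarse_cost = 1 / 4    # relative cost of 2x2 variable-rate shading

relative_work = fovea_fraction * 1.0 + (1 - fovea_fraction) * coarse_cost
print(f"shading work: {relative_work:.0%} of full rate "
      f"(~{1 - relative_work:.0%} saved)")
# -> shading work: 48% of full rate (~52% saved)
```

Actual frame-time gains would be smaller, since fragment shading is only part of each frame, so a headline figure in the tens of percent is at least plausible.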