That’s funny; I always went for the cheapest card. Right now I have the two-fan EVGA 2080 Ti Black, and I was a little disappointed in its performance improvement over my three-fan 1080 Ti (from a used PC I bought). UserBenchmark showed the card as underperforming the average; I’ll run it again today, though. So now I’m going for top of the line, and then I probably won’t upgrade again for at least two years.
The X in Gaming X Trio means it’s the OC version, while the Gaming Trio is not.
Ah, I was just looking at the clocks - I guess I missed the X. So it was the non-X version that was briefly available (1740 MHz boost, not 1785 MHz).
That’s interesting - if Linux can run up to 2100 MHz and NVIDIA’s press (pre-production) drivers don’t have the problem, but NVIDIA’s current Windows drivers crash past 2000 MHz, that seems like a driver problem to me.
Another upside is that top 3090 cards might be able to run up to 2100 MHz, in which case some water-cooling could make sense. I was a little disappointed by der8auer’s water-cooling results…
So basically not much difference. I’m watching these things and it’s 2:30 am, lol.
No, it seems like a driver issue in Windows only.
Different-quality “caps” will of course play a role, but apparently not a huge one.
Only tested with a Vive Pro, but the conclusion is:
“We are surprised that the RTX 3080 – although Ampere has no VR optimizations over Turing – is able to give a superior VR experience at more demanding settings than the RTX 2080 Ti. We think that the RTX 3080 is a good upgrade over the RTX 2080 Ti for VR, even more so than for pancake gaming.”
Not sure how good the site is, or how the results will scale to a Pimax 5K X or 8K X. But I like that there is at least one benchmark using Elite in VR with the 30-series.
Yeah, I took my time with the 2080 Ti. I didn’t buy it until August 2019, so there was plenty of research available. I wasn’t concerned about budget at the time; I wanted the best, but still ‘bang for buck’ at the top end… a conundrum, I know. Point is, I did the research and would have paid $300 more at the time if it made sense. In the end I settled on the MSI 2080 Ti Gaming X Trio. From all the research I did, it had the highest teraflops count on a like-for-like basis. Comparing 2080 Tis with 2080 Tis I am fine with: same platform, same card. I wouldn’t use that metric for AMD vs Nvidia, or perhaps even Nvidia vs Nvidia if it wasn’t the same card. There might have been a couple of exceptions, but I have been really happy with it. At that price point it was already so much money that paying the extra made sense.

The 3090, though - I don’t see it. At least not yet. Early results on the Strix are appealing, but the cheapest I can find it for is $700 more than what I paid for my TUF. I can’t justify that. The only reason I am not going for the 3080 is the VRAM. I have seen some thoughtful comments by risa2000; I like his/her thought process, though I’m not sure I agree with it, at least not totally.

I have been pushing a Pimax 4K in VR sims - nice res compared to the likes of Oculus (which I also own), and great for motion rigs since it doesn’t require tracking, just a gyro. My experience is that the CPU is not even slightly taxed on an AMD 3900X. GPU utilisation was finicky, though. With VR in general, most of us forget about ASW… so sometimes, counter-intuitively, you actually have to drop your graphics settings to get better, more consistent GPU utilisation. Think about it for a sec: if you set them too high and ASW kicks in - bam - half the frames. GPU utilisation plummets. You’ve got to find that sweet spot. I guess it’s a rare use case where dropping the graphics settings can increase GPU usage (rough numbers in the sketch below).

Back to the Strix: someone else posted the video with the young long-haired fellow making the point about paying a bit extra. With the 2080 Ti I agreed, but that was a couple hundred extra (Aussie dollars - don’t want to hear about USA prices, lol). $700 more for the Strix? Not at launch, not for me at least. My understanding is Buildzoid reckons an 8-pin connector, whilst only rated for 150 W, can easily push double that. That removes the power-delivery advantage of the Strix from the equation. Silicon binning - who knows; depends on the card you get. Same goes for changing the BIOS and how cool you can keep it.

I mean, these ASUS cards are running nice and cool already, but stability gets finicky not just with clock or memory frequencies but with thermals. So did you get a good thermal case? Did you save the $700 and spend it on a better PSU and case to keep thermals down, maybe a Noctua or IceGiant cooler? Some better case fans… you get the point - and you’d still have a chunk of money left over to pair your fancy new GPU with the soon-to-be-released AMD CPUs. My thought process anyway. Apologies if I rambled.

Just reread your post - I got lucky with mine. It might be because of all the other factors I mentioned (PSU/cooling/case etc.), but mine performed in the top percentile. Just to be clear: I do not recommend the MSI Gaming Trio for the 3000 series. In my opinion they sat on their laurels this time round.
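On the ASW point above, here’s a quick toy sketch (my illustrative numbers, assuming a 90 Hz headset where ASW drops rendering to half rate the moment the GPU misses its frame budget): nudging settings from just under to just over the budget halves the native framerate, and measured GPU utilisation drops with it.

```
# Toy model of ASW: a 90 Hz headset, where ASW locks rendering to half
# the refresh rate as soon as a frame takes longer than the budget.
# The frame times are made-up illustrative numbers.

REFRESH_HZ = 90
FRAME_BUDGET_MS = 1000 / REFRESH_HZ   # ~11.1 ms per frame

def native_fps(gpu_frame_ms: float) -> float:
    """Frames the GPU actually renders per second under ASW."""
    if gpu_frame_ms <= FRAME_BUDGET_MS:
        return REFRESH_HZ        # GPU keeps up: full rate
    return REFRESH_HZ / 2        # ASW kicks in: half rate, rest synthesized

for settings, frame_ms in [("medium", 10.5), ("high", 11.5)]:
    fps = native_fps(frame_ms)
    busy_pct = fps * frame_ms / 10   # % of each second spent rendering
    print(f"{settings}: {frame_ms} ms/frame -> {fps:.0f} fps, ~{busy_pct:.0f}% GPU busy")
```

Medium lands at 90 fps and ~94% utilisation; high trips ASW and falls to 45 fps and ~52% utilisation - the counter-intuitive “lower settings, higher GPU usage” effect.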
Yessir - finding good VR benchmarks is like looking for a needle in a needlestack… where the needle you are trying to find is in a different needlestack. I understand the Pimax 8K X is a very niche use case applying to a tiny percentage of the population, but gosh darn it, it would be nice to be able to make more informed purchasing decisions when so much money is on the line.
I dunno - der8auer is amazingly talented and intelligent, but a couple of tests does not make a worldwide production-run test. There are so many videos showing a correlation with cards that do/don’t have the MLCC groups, but yeah - it doesn’t matter which card it is: push it hard enough and it will become unstable. Ultimately I think he is probably right that it’s a bit of both, the capacitors and the rest of the setup.

We also gotta remember that Linus is going to get the best silicon they can possibly send him. Same with most of the reviewers; we need to pay attention to reviewers who are actually using retail cards… and even then the cards will vary widely, even if they are the exact same make and model. That said, even with ‘reviewer quality’ cards there seems to be a qualification. The whole thing is: if 2000 MHz seems to be the general cap, and the Strix can push past that - which it seems it can - is it mainly down to the better cooling solution? It also has the six MLCC groups… Enough power appears to be getting through perfectly stably on the cards that tend to be collapsing around the 2000 mark (quick connector arithmetic below), so what other components does the Strix have that are superior to, say, the TUF? It has larger fins and a bigger heatsink… I am guessing that is a large part of it.
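Putting rough numbers on the connector point from the posts above (a sketch only; the 150 W per 8-pin figure is the PCIe spec rating, and the “double that” headroom is the Buildzoid claim, not a spec):

```
# Rough power-budget arithmetic for a card with two 8-pin connectors.
# 150 W per 8-pin is the spec rating; the ~300 W "practical" figure is
# the Buildzoid claim quoted above, not a measured or official number.

PCIE_SLOT_W = 75             # power the motherboard slot is rated for
EIGHT_PIN_RATED_W = 150      # spec rating per 8-pin connector
EIGHT_PIN_CLAIMED_W = 300    # claimed practical headroom (~2x rated)
CONNECTORS = 2

spec_budget = PCIE_SLOT_W + CONNECTORS * EIGHT_PIN_RATED_W
claimed_budget = PCIE_SLOT_W + CONNECTORS * EIGHT_PIN_CLAIMED_W

print(f"By-the-spec budget: {spec_budget} W")    # 375 W
print(f"Claimed headroom:   {claimed_budget} W") # 675 W
```

If the connectors really can deliver well past their rating, a two-connector card isn’t connector-limited at a ~350-400 W power limit, which would mean the Strix’s extra power plug isn’t the deciding factor.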
Honestly, I don’t see any head start there. The 2080 Ti Founders Edition, with its joke of a cooler and 250 W power limit, is arguably the worst-scaling card on the market, running about 25% below what it can actually do.
But thanks for the interesting link.
Yeah, mustn’t forget the silicon lottery.
So, it seems the new driver patch Nvidia released solves all the instabilities - without downclocking the boost.
And it wasn’t the capacitors, because all cards were equally affected (Asus’s TUF as well). It also turned out they weren’t even POSCAPs - not even on the Zotac Trinity - but SP-CAPs. Anyway, the capacitor BS was irrelevant.
Crazy how half the internet got riled up; cringe that I believed it as well.
Well, back to huntin’
… but Igor said… xD
A 250 W power limit basically cripples the 2080 Ti.
More cores usually means lower clocks
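If it helps, here’s a back-of-the-envelope model of why that tends to hold (my simplification, nothing from the thread): dynamic power scales roughly with cores × clock × voltage², and the voltage needed rises roughly with clock, so at a fixed power budget a wider chip has to clock lower.

```
# Toy model: at a fixed power budget, P ~ cores * f * V^2 and V ~ f,
# so P ~ cores * f^3 and the sustainable clock scales as cores^(-1/3).
# The core counts are the real 3080/3090 shader counts; the 1.9 GHz
# starting clock is just an illustrative number.

def sustainable_clock(base_cores: int, base_clock_ghz: float,
                      new_cores: int) -> float:
    """Clock a wider chip can hold at the same power budget (toy model)."""
    return base_clock_ghz * (base_cores / new_cores) ** (1 / 3)

# 3080 (8704 cores) at 1.9 GHz -> what a 3090 (10496 cores) could hold
print(f"{sustainable_clock(8704, 1.9, 10496):.2f} GHz")  # ~1.79 GHz
```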