AMD 6000 GPU series discussion

That was also the first thing I checked: the 6800 XT and 6900 XT are both 2.5-slot designs, only the 6800 non-XT is 2-slot. So the faster cards wouldn’t fit in my current case. Sniff, it’s a nice case.

2 Likes

You must have an ultra-portable desktop. My 2080 Ti fits quite well. I doubt I’d have problems with a 3080 Ti or 6900 XT.

1 Like

Yepp, an A4-SFX (7.2 l), which is smaller than many eGPU enclosures but holds the full system. Very nice! A 2080 Ti FE would still fit, but the new high-end hardware generation doesn’t seem to fit anymore. I was considering the NR200, which looks interesting, is inexpensive, and would fit up to 3-slot cards and bigger CPU coolers. It has been completely sold out for months though. That seems to be a trend with current hardware…

3 Likes

Aw, I see. Mine is small enough and comes with a handle. I could have gone smaller, but heat dissipation can be an issue, so I kept my portable build in a mid-size case. No problems so far. I’ll need to see what sizes the AMD boards currently on the market come in.

4 Likes

Interesting that nobody talks about DLSS here.
I’m not sure whether that technology may actually be a game changer.
It makes it super hard to compare those two. We’ll see when we know more.

2 Likes

I think it’s more about those who wanted to get an RTX 3090. We’ll have to wait and see the actual reviews.

We have an RTX 3000 discussion, so let’s talk about the new AMD GPUs here.

Looks like they are better than the 3080 and 3090 in performance and price. Only DLSS 2.0 keeps Nvidia from drowning.

$1000 for the 6900 XT vs. $1500 for the RTX 3090

What’s your opinion?

6 Likes

This.

Also, my take from the presentation is that the Infinity Cache works its wonders especially well at lower resolutions, not higher ones, as seen in the comparison of the 2080 Ti to the 6800: the gap widened from the first presented 4K numbers to the 1440p numbers shown directly after. I could be wrong about that though…

So if I’m not wrong, then especially for very high resolution VR gaming (what we all like), GDDR6X cards will be better, and more so the higher the supersampling.

I ordered the Zotac 3090 (the only available card, for a few seconds) but cancelled the order a day later, after educating myself that the Zotac is a 350 W limited card and the price is a shame (it was a little over €2000). So, with a pain in my heart, I didn’t accept the parcel today.

I’m now trying to catch a 3090 Founders Edition somewhere. Wish me luck…

3 Likes

… Hmmm… I can’t really see AMD ahead of Nvidia. But let’s wait for the real test numbers…

Edit: mixed up boost mode with Rage Mode

3 Likes

After hearing about the Infinity Cache, it sounds in theory very similar to the AMD Xenos architecture in the Xbox 360: basically a very small amount of memory that was super fast and helped the system’s graphics performance. I believe developers loved that architecture over the PS3’s Cell + Nvidia RSX setup.

4 Likes

I still think the 3090 may be faster overall, but not by much. It comes down to the price-to-performance ratio.

2 Likes

Yep. eDRAM.

Then my assumptions are very likely true, because the bigger the VRAM working set, the smaller the fraction of it the Infinity Cache covers, so the smaller the gain in performance…
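The cache-fraction argument can be put into a toy model. This is only a back-of-the-envelope sketch: the 128 MB cache size is from AMD’s announcement, but the bandwidth figures are made up for illustration, and it assumes a uniformly accessed working set:

```python
# Toy model: effective bandwidth as a blend of cache and VRAM bandwidth,
# assuming every byte of the working set is touched equally often.
# Bandwidth numbers are illustrative, not measured.

CACHE_SIZE_MB = 128   # Infinity Cache size on Navi 21
CACHE_BW_GBS = 1600   # hypothetical on-die cache bandwidth
VRAM_BW_GBS = 512     # hypothetical GDDR6 bandwidth

def effective_bandwidth(working_set_mb: float) -> float:
    """Blend cache and VRAM bandwidth by the fraction of the
    working set that fits in cache (uniform-access assumption)."""
    hit_rate = min(1.0, CACHE_SIZE_MB / working_set_mb)
    return hit_rate * CACHE_BW_GBS + (1.0 - hit_rate) * VRAM_BW_GBS

for ws in (256, 1024, 4096):  # e.g. modest vs 4K-ish vs heavy VR working set
    print(ws, round(effective_bandwidth(ws)))
```

Under these toy assumptions the effective bandwidth drops quickly once the working set outgrows the cache, which is exactly the worry for high-resolution, high-supersampling VR workloads.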

3 Likes

That is the hunch I have.

1 Like

Unfortunately not only that. Historically, the AMD drivers, and especially the VR support, have been terrible, or at least much worse than Nvidia’s.
But the prices (especially of the 6900 XT) crush Nvidia… good for us consumers.

3 Likes

Something tells me AMD will try to at least keep up this generation. I will get a new GPU next summer; by then it will be clear where both companies stand.

2 Likes

Hopefully AMD uses its monetary relief to strengthen its software (driver) division. They really need it.

2 Likes

Agreed. I am still fine with my RTX 2080 Ti, so I can afford to be an observer for a bit. I am also curious about the performance level of the new mobile parts.

2 Likes

It will be interesting to see how the machine-learning-based, work-in-progress super-resolution solution AMD mentioned in the presentation will turn out: whether it will be AMD-proprietary or not and, more importantly, whether they manage to ship a solution that generalizes over enough games that it doesn’t have to be trained per title.
I have no clue how many VR games will still be written exclusively for PCVR, to be honest. And DLSS, as great as it is, is currently an island solution that only works on PC and needs every game to be specifically adapted to support it.

Probably the bigger elephant in the room is still CUDA (not so much for gaming, but for HPC), a sector where Nvidia has been making tons of money lately and where AMD couldn’t get much traction yet. So let’s hope Nvidia still considers the gaming market “worth it” for the longer-term future. A de facto monopolist would be bad, no matter whether it starts with A, N or I.

2 Likes

I think waiting until summer and seeing how it all pans out is the smart decision for those not in a terrible hurry.

I think DLSS and AMD’s version of it are a bit of a funny thing. Back in the day, the GPU manufacturers got a lot of heat for sacrificing some graphical fidelity for FPS, but today we’re OK with them selling 720p cards that kind of guesstimate the rest of the image and calling them 8K-capable? (Hyperbole, I know, but the general point stands.)

2 Likes

As far as I understood, it mostly remembers information from past frames instead of guesstimating( * ), so the accumulated information about an object can indeed be higher than in a single high-res image, and the upscaled image can actually contain more accurate details. Sounds pretty clever. And that’s the part that makes me hope they can generalize it at some point.
( * But what about elements that have just entered the screen: are they low-detail for a few frames? Or guesstimated? (But then they would “morph” as soon as the real details become known?) Or is the rendered screen bigger than the visible area, so each object has already appeared in several frames before it becomes visible? Pretty impressive technology in any case!)
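The “remembering past frames” idea can be sketched as temporal accumulation of jittered samples. This is a deliberately simplified 1-D toy, not Nvidia’s actual pipeline; it assumes a static scene and perfect reprojection, and all names and numbers here are made up:

```python
# Toy temporal accumulation: each frame renders the "scene" at low
# resolution with a different sub-pixel jitter; samples are merged
# into a high-res history buffer, so detail builds up over frames.

import math

HI_RES = 16   # target ("output") resolution
LO_RES = 4    # render resolution (1/4 of target)
ALPHA = 0.2   # blend weight for the newest sample

def true_signal(x: float) -> float:
    """Ground-truth scene detail we are trying to reconstruct."""
    return math.sin(2 * math.pi * x)

def accumulate(frames: int) -> list:
    """Accumulate jittered low-res samples into a hi-res history buffer."""
    history = [0.0] * HI_RES
    filled = [False] * HI_RES
    step = HI_RES // LO_RES
    for f in range(frames):
        jitter = f % step                 # sub-pixel jitter per frame
        for i in range(LO_RES):
            hi = i * step + jitter        # hi-res pixel this sample lands on
            sample = true_signal(hi / HI_RES)
            if filled[hi]:
                # blend the new sample into the (reprojected) history
                history[hi] = (1 - ALPHA) * history[hi] + ALPHA * sample
            else:
                history[hi] = sample      # newly seen pixel: take sample as-is
                filled[hi] = True
    return history

out = accumulate(8)
max_err = max(abs(out[i] - true_signal(i / HI_RES)) for i in range(HI_RES))
print(f"max reconstruction error after 8 frames: {max_err:.1e}")
```

Note the `filled` branch: a pixel seen for the first time just takes the raw new sample, which matches the intuition in the footnote that newly revealed elements start out low-confidence and get refined as history accumulates.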