No DisplayPort 2.0 on Nvidia Ampere 3090? 🤨 🧐

Same number of shaders, but a new manufacturing process, so it's faster. With less memory, though, the 2080 Ti comes out ahead again.

1 Like

You guys forget that ray tracing performance will be twice as fast. We might even get ray tracing in VR games.

FYI: Some of the rumors I've seen say ray tracing is 4x faster. We won't know for sure until the cards are released.

1 Like

I'm sticking to my Nvidia upgrade religion:
Upgrade when needed and when the advancement in the top-tier model is significant.
Skip at least a full generation of GPUs.

Last example:
Upgraded from 2x GTX 580 in SLI, heavily OC'ed and H2O cooled.
Upgraded to a GTX 1080 Ti, heavily OC'ed and H2O cooled.

It seems to me that the advancements in Ampere are enough to justify upgrading again in the same manner to an RTX 3090: slap a full-cover H2O block on it and OC the hell out of it.
DP 2.0 port or not, that won't hold me back. I skipped the RTX 2080 Ti and I'm happy with that decision.

Edit:
I'm not paying extra for binned EVGA products; I'd rather use the extra $$$ for H2O cooling.

6 Likes

Hehe, almost snap!

2x GTX 460s (until ED: Horizons came out; they couldn't render the planets properly and I kept falling through them :smile:)
980 Ti
2080 Ti

Now I'm in no hurry, gonna sit back and eventually buy whichever card is proven to let the 8KX work at maximum potential :+1:

3 Likes

It's a good policy and one I intend to follow. Looking forward to the upgrade from my 1080 Ti.

1 Like

Apparently the 3080 is significantly faster than the 2080 Ti, and the 3070 is supposed to be on par.

But I guess we'll know for sure either way tomorrow!

1 Like

Yes. I expect the 3080 Ti/3090 to be way overpriced, but I'm hoping the 3080 will be reasonably priced and could be my next card after the 1080 Ti.

1 Like

Wise decision. I also have a water-cooled 1080 Ti. I've really missed having VRS to try FFR & DFR, but the wait was beneficial; I also plan to grab a 3090 with 24GB. I guess we're finally going in the right direction: eye tracking is becoming more and more widely available (Tobii, 7invensun, other companies). We already have the StarVR One and Vive Pro Eye, now Pimax Droolon; somebody finally did driver-level DFR/VRS support, and the SDK is spreading among devs. I'm interested to see whether Oculus announces any ET soon, as it would also help their mobile HMD save power even more aggressively, plus the much-anticipated varifocal lenses.

Eh, I would like to see a new OLED HMD with higher density. The HTC Vive Pro is really amazing, but there is still room for improvement. That's why I ordered an 8K P2 (OG): to see how an OLED PenTile panel could look in a Pimax, and whether it's better than the Pro, or at least at the same level in terms of SDE.

So many things & new VR HMDs happening lately. Just remember January 2020, when almost everybody was skeptical about new HMDs in 2020 (apart from those already shown, and Sony for 2021), since every major player had already shown their recent developments. And now we have the G2, something still unclear from Oculus, Pimax finally releasing, ET coming true, new video cards! Oh, they know how to drain money from people :sweat_smile:

2 Likes

Ok thanks for explaining your reasoning.

Is it not possible that the shaders are better/doing more work in the new cards? I just wonder why none of the techtubers I've seen discussing it have come to the same conclusion.

If the performance increase is as minimal as you suggest (if 3080 = 2080 Ti, the 3090 probably won't be much faster?), do you think AMD will come out on top this gen? Or do you maybe believe Nvidia is doubling down on ray tracing performance, so that instead of pushing higher resolutions at higher frame rates, you can ray trace without dropping resolution?

Thoughts after seeing the specs?

3 Likes

Mind blown oO! When can we preorder it? BTW, I don't believe in 10k CUDA cores. It's 5248 (x2) somehow (double FP32 per shader), and we don't actually know what that means for non-RT games.
Metro Exodus should run at 4K with maxed ray tracing at 60 fps, I think, instead of the 25-30 fps we now have on a 2080 Ti, which is amazing.

3 Likes

It sounds dodgy, but surely they can't straight-up lie about the core-count number? I thought maybe there was an asterisk next to the CUDA count until I zoomed in.

… You know You're on the Pimax forums when You see a question like that… :rofl:

5 Likes

Won't the new "Big Navi" AMD video cards support DisplayPort 2.0?
Of course, that will matter very little if the new GPU isn't competitive with Nvidia. Or will it? Would the increased bandwidth make up for it?

Nvidia has changed the cores so that they can perform 2x 32-bit operations per clock. That's a very fine line, and I'm not sure their marketing department is being honest. The concept is clear: each raster core is twice as powerful, but the number of cores should not be double-counted.
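To make the double-counting concrete, here's a minimal sketch (the function name and the ~1.7 GHz boost clock are my own assumptions for illustration) showing that counting 5248 dual-issue shader units or 10496 marketing "CUDA cores" yields the same theoretical throughput; only the accounting changes:

```python
# Theoretical peak FP32 throughput. All figures here are assumptions
# based on this thread's rumored 3090 specs plus an assumed boost clock.

def fp32_tflops(fp32_lanes, ops_per_lane_per_clock, boost_ghz):
    """Peak TFLOPS = lanes * ops per clock * clock (GHz) / 1000."""
    return fp32_lanes * ops_per_lane_per_clock * boost_ghz / 1000.0

# Marketing view: 10496 "CUDA cores", each doing one FMA (2 ops) per clock.
marketing = fp32_tflops(10496, 2, 1.7)

# Hardware view: 5248 shader units, each dual-issuing FP32 (2 FMAs = 4 ops) per clock.
hardware = fp32_tflops(5248, 4, 1.7)

print(marketing, hardware)  # same number either way: the "doubling" is in the count
```

Either way the headline TFLOPS figure is identical, which is why the spec sheet isn't technically a lie, just generous accounting.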

3 Likes

RDNA2 will likely be competitive with Nvidia, which is exactly why Nvidia halved the price of the flagship xx02 chip. Will RDNA2 support DP 2.0? We don't know yet. My best guess is there's about a 50% chance that it does.
Reasons why yes:
AMD is historically ahead of Nvidia in adopting new technologies.
The spec was finalized last year, which is enough time to implement it on a new GPU.
Reasons why no:
If itā€™s so easy and cheap, why didnā€™t Nvidia do it too?
There isn't much use for it right now, except for future-proof VR and 8K gaming (which will still be difficult to drive at high refresh rates for performance reasons).
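As a rough sanity check on the bandwidth question, here's a sketch (Python; it ignores blanking overhead and DSC compression, so treat the numbers as ballpark figures) comparing uncompressed video bitrates against the effective link rates of DP 1.4 (HBR3, 8b/10b coding) and DP 2.0 (UHBR20, 128b/132b coding):

```python
# Effective DisplayPort link rates after channel coding overhead (approx).
DP14_HBR3_GBPS = 4 * 8.1 * (8 / 10)       # 4 lanes x 8.1 Gbit/s, 8b/10b  ~= 25.9
DP20_UHBR20_GBPS = 4 * 20 * (128 / 132)   # 4 lanes x 20 Gbit/s, 128b/132b ~= 77.6

def video_gbps(width, height, hz, bits_per_channel=8, channels=3):
    """Raw bitrate of the active video in Gbit/s (no blanking, no DSC)."""
    return width * height * hz * bits_per_channel * channels / 1e9

eightk60 = video_gbps(7680, 4320, 60)     # 8K, 60 Hz, 8-bit RGB: ~47.8 Gbit/s
print(eightk60 <= DP14_HBR3_GBPS)         # False: 8K60 needs DSC on DP 1.4
print(eightk60 <= DP20_UHBR20_GBPS)       # True: fits uncompressed on DP 2.0
```

So 8K60 already overflows DP 1.4 without compression, while DP 2.0 carries it with headroom to spare; for lower-resolution panels the extra bandwidth buys nothing today, which fits the "not much use right now" argument.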

2 Likes

Another hypothesis: even though it's nice to have, the lack of DP 2.0 won't stop many people from buying the current card generation in 2020. And since both Nvidia and AMD may have to go (closer to) all-in, as usual, to counter the competitor, the performance step in the next card generation might be smaller. So DP 2.0 could be saved as a convincing argument to get people to buy those anyway, particularly if coordinated with next-gen G-Sync/FreeSync monitor releases.

1 Like

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.