Same number of shaders, but a new manufacturing process, so it's faster but has less memory, which means the 2080 Ti comes out ahead again.
You guys forget that raytracing performance will be twice as fast. We might even get raytracing in VR games.
FYI: Some of the rumors I've seen say raytracing is 4x faster. We won't know for sure until the cards are released.
I'm sticking to my Nvidia upgrade religion:
Upgrade when needed and when the advancement in the top-tier model is significant.
Skip at least a full generation of GPUs.
last example:
Upgraded from 2x GTX 580 in SLI, heavily OC'ed and H2O cooled.
Upgraded to a GTX 1080 Ti, heavily OC'ed and H2O cooled.
It seems to me that the advancements in Ampere are enough to justify upgrading again in the same manner to an RTX 3090: slap a full-cover H2O block on that as well and OC the hell out of it.
DP 2.0 port or not, that won't be holding me back. I skipped the RTX 2080 Ti and I'm happy with that decision.
Edit:
Not paying extra for binned EVGA products; I'd rather use the extra $$$ for H2O cooling.
Hehe almost snap
2x GTX 460s (until ED Horizons came out; they couldn't render the planets properly and I kept falling through them)
980 Ti
2080 Ti
Now I'm in no hurry; gonna sit back and eventually buy whichever card is proven to let the 8KX work at maximum potential.
It's a good policy and one I intend to follow. Looking forward to the upgrade from my 1080 Ti.
Apparently the 3080 is significantly faster than the 2080 Ti, and the 3070 is supposed to be on par.
But I guess we'll know for sure either way tomorrow!
Yes
I expect the 3080 Ti/3090 to be way overpriced, but I'm hoping the 3080 will be reasonably priced and could be my next card after the 1080 Ti.
Wise decision. I also have a water-cooled 1080 Ti, and I've missed VRS so much, having not been able to try FFR & DFR yet, but the wait was beneficial; I also plan to grab a 3090 with 24GB. I guess we're finally going in the right direction: eye tracking is becoming more and more widely available (Tobii, 7invensun, other companies). We already have the StarVR One, Vive Pro Eye, and now Pimax's Droolon; somebody finally did driver-level DFR/VRS support, and the SDK is spreading among devs. I'm interested to see whether Oculus makes any ET announcement soon, as it would also help their mobile HMDs conserve power even more aggressively, plus the much-anticipated varifocal lenses.
Eh, I would like to see a new OLED HMD with higher density. The HTC Vive Pro is really amazing, but there is still room for improvement. That's why I ordered the 8K P2 (OG): to see how an OLED PenTile display could look in a Pimax, and whether it's better than the Pro, or at least on the same level in terms of SDE.
So many things & new VR HMDs happening lately. Just remember January 2020, when almost everybody was skeptical about new HMDs in 2020 (apart from those already shown, and Sony for 2021), since every major player had already shown its recent developments. And now we have the G2, something still unclear from Oculus, Pimax finally releasing, ET coming true, new video cards! Oh, they know how to drain money from people.
Ok thanks for explaining your reasoning.
Is it not possible that the shaders are better/doing more work in the new cards? I just wonder why none of the techtubers I've seen discussing it have come to the same conclusion.
If the performance increase (if 3080 = 2080 Ti, the 3090 probably won't be much faster?) is as minimal as you suggest, do you think AMD will come out on top this gen? Or do you maybe believe Nvidia is doubling down on raytracing performance, so that instead of pushing higher resolution at higher frame rates, you can raytrace without dropping resolution?
Mindblown oO! When can we preorder it? Btw, I don't believe in 10k CUDA cores. It's 5248 (x2) somehow (double FP32 per shader); what that actually yields in non-RT games, we don't know yet.
Metro Exodus should run at 4K with maxed raytracing at 60 fps, I think, instead of the 25-30 fps we now have on the 2080 Ti, which is amazing.
It sounds dodgy, but surely they can't straight-up lie about the core count? I thought maybe there was an asterisk next to the CUDA figure until I zoomed in.
… You know you're on the Pimax forums when you see a question like that…
Won't the new "Big Navi" AMD video cards support DisplayPort 2.0?
Of course, that will matter very little if the new GPU isn't competitive with Nvidia. Or will it? Would the increased bandwidth make up for it?
Nvidia has changed the cores so that they can perform 2x 32-bit operations per clock. That's a very fine line, and I'm not sure their marketing department is being honest. The concept is clear: each raster core is twice as powerful, but the number of cores should not be double-counted.
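The arithmetic behind that double-counting can be sketched quickly. A minimal example in Python, assuming the publicly rumored 3090 figures (10496 marketed cores, 5248 physical shader partitions, ~1.70 GHz boost clock), so treat the exact numbers as assumptions:

```python
# Theoretical FP32 throughput. An FMA (fused multiply-add) counts as
# 2 floating-point operations per clock per core.

def fp32_tflops(cores: int, boost_ghz: float) -> float:
    """Peak TFLOPS = cores * 2 FLOPs/clock * clock (GHz) / 1000."""
    return cores * 2 * boost_ghz / 1000.0

# Counting each doubled FP32 datapath as its own "CUDA core":
marketed = fp32_tflops(10496, 1.70)

# Counting cores the old way, but crediting 2x FP32 per core:
physical = fp32_tflops(5248, 1.70) * 2

print(f"marketed counting:  {marketed:.1f} TFLOPS")
print(f"physical counting:  {physical:.1f} TFLOPS")
# Both ways of counting yield the same theoretical peak; the open
# question is how much of it real (non-RT) games can actually use.
```

Either way the peak number is identical, which is why the "10k cores" headline is technically defensible but easy to misread.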
RDNA2 will likely be competitive with Nvidia, which is exactly the reason why Nvidia halved the price of the flagship xx02 chip. Will RDNA2 support DP 2.0? We don't know yet. My best guess is there's about a 50% chance that it does.
Reasons why yes:
AMD is historically ahead of Nvidia in adopting new technologies.
The spec was finalized last year, which is enough time to implement it on a new GPU.
Reasons why no:
If it's so easy and cheap, why didn't Nvidia do it too?
There isn't much use for it right now, except for future-proofing VR and 8K gaming (which will still be difficult to drive at high refresh rates because of performance)
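To put rough numbers on the bandwidth side of that argument, here's a minimal sketch. It compares raw uncompressed pixel rates only, ignoring blanking intervals and DSC, so the exact cutoffs are approximate:

```python
# Back-of-the-envelope DisplayPort link-budget check.
# Assumes uncompressed 8-bit RGB (24 bits per pixel) and ignores
# blanking overhead, so real-world requirements are somewhat higher.

def pixel_gbps(w: int, h: int, hz: int, bpp: int = 24) -> float:
    """Raw pixel data rate in Gbit/s."""
    return w * h * hz * bpp / 1e9

DP14_PAYLOAD = 4 * 8.1 * 8 / 10      # HBR3 x4 lanes, 8b/10b -> 25.92 Gbit/s
DP20_PAYLOAD = 4 * 20.0 * 128 / 132  # UHBR20 x4 lanes, 128b/132b -> ~77.6 Gbit/s

for name, rate in [("4K @ 120 Hz", pixel_gbps(3840, 2160, 120)),
                   ("8K @ 90 Hz",  pixel_gbps(7680, 4320, 90))]:
    print(f"{name}: {rate:.1f} Gbit/s  "
          f"fits DP 1.4: {rate <= DP14_PAYLOAD}  "
          f"fits DP 2.0: {rate <= DP20_PAYLOAD}")
```

Even with this optimistic accounting, 8K at 90 Hz blows well past DP 1.4's payload and only fits inside DP 2.0, which is why the port matters for future-proofing but not for most 2020 displays.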
Another hypothesis: even though it's nice to have, the lack of DP 2.0 won't stop many people from buying the current card generation in 2020. And since both Nvidia and AMD might have to go (closer to) all in (as usual) to counter the competitor, the performance step in the next card generation might be smaller. So DP 2.0 could be saved as a convincing argument to make people buy those next-generation cards, particularly when coordinated with next-gen G-Sync/FreeSync monitor releases.