So I just realized I'm a sucker (may have overpaid for my Ryzen 5800)

Ha ha! Just checked Micro Center and they have 25+ 5600X in stock today. They do go fast though, so I need to get off my a$$ and get down there.

2 Likes

So I dug into 5800X benchmarks and noticed one thing I was not aware of: there seems to be a certain discrepancy in thermal dissipation between the 5600X and the 5800X. In particular, it looks like the 5800X needs more power per core (and thus produces more heat) to keep the same per-core performance as the 5600X. So while it is definitely an advantage to have all 8 cores on one chiplet, it seems there is a price to pay for that, and I am now not sure myself if I want that trade-off.
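A quick back-of-the-envelope on the rated figures (a minimal sketch using the official TDPs of 65 W for the 5600X and 105 W for the 5800X; keep in mind TDP is a design target, not measured package power under boost):

```python
# Rough per-core thermal budget from rated TDP.
# TDP is a thermal design target, not actual measured draw under boost.
def tdp_per_core(tdp_watts: float, cores: int) -> float:
    return tdp_watts / cores

print("5600X W/core:", tdp_per_core(65, 6))   # ~10.8
print("5800X W/core:", tdp_per_core(105, 8))  # ~13.1
```

So even on paper the 5800X has a noticeably bigger thermal budget per core, which lines up with what the benchmarks show.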

2 Likes

The reason the 5800 has a relatively high markup is that AMD would much rather sell you two CCXs of 6 cores in the 5900 than the one 8-core CCX in the 5800. A fully working all-core CCX will have a much lower yield than the cut-down ones, where only most of the cores need to be functional.

There are latency advantages to the 5800, but whether they are enough to prefer it over the 5900 with its extra cores depends on your workload.

2 Likes

lol…no thanks I have enough VAT from Pimax.

3 Likes

My guide to upgrading:

What software do I need to run?
What fidelity do I need to run it at?
What hardware do I need to achieve that?
Is it a viable investment at this time in my gaming addiction? :grinning:

2 Likes

As long as you don’t want to overclock, it’s not that important. Most of the time the CPU will idle, so how much power it consumes is beside the point; for most people it only runs at full power for less than 2 hours a day.

2 Likes

Are you sure you will even notice any difference with that “upgrade”?
Having 2 fewer cores and slightly more performance (if any) might only affect the few (badly programmed) titles that rely on 1-4 cores.
In most cases the GPU will limit performance, not the CPU (only those insane 1080p benchmarks with >200 fps might see some improvement from a faster CPU; at 1440p it will more likely be a GPU limit, and at 4K it’s always the GPU that holds you back).
I think a 2700X is still good, and with these price hikes (low availability, just before Christmas) the investment is not worth it atm.

2 Likes

Pretty much true, which is why I haven’t upgraded. The 2700X and 2080 Ti are still doing a solid job for me 👍

2 Likes

Welcome to the suckers club !

2 Likes

How much faster is a 2700X than a 4790?

I pretty much never upgraded for 6 years because of the same reasons you state.

I’m only upgrading now because I would feel a bit ridiculous pushing a 3090 or higher GPU on a 4790 in the future, and when that time came I’d rather just have to weather the finances of a GPU and not a whole PC + GPU.

I probably won’t feel the need to upgrade my CPU again for 6 more years. By then DDR5 will be the standard and we’ll be rocking 8TB NVMe drives and PCIe 5.

lol

From what I saw in some benchmarks, in particular:

the difference between 2700X and 5600X is quite significant in everything except 4K gaming, where both are GPU limited.

2 Likes

No, I would say one who had done his research. And if not mistaken, he is also a dev.

AMD since the Ryzen 2000 series has been pulling out all the stops. You need to do more research with an open mind.

3 Likes

In most cases even 1440p for VR at 90Hz (or more) will be a GPU limit.
In that scenario a 5600X vs. a 2700X might give you an advantage of about 10% in fps.

https://www.guru3d.com/articles_pages/amd_ryzen_5_5600x_review,24.html

Selling a 2700X now (if possible) and buying a 5600X will cost 300-400€.
It depends on the GPU, but with a regular 1080 you might be better off putting the money toward a better GPU;
with a 2080 Ti the GPU is hard to improve with just 300-400€ to spare.
On pure CPU multicore usage (calculations) the win for a 5600X is not that decisive.

Already having a pretty high-end gaming system (32GB RAM, 2700X and 2080 Ti), waiting 2-3 months for less hype and better prices (maybe even more options) is wiser; you will not gain that much now for the same money.
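Just to make the trade-off concrete, a minimal sketch with hypothetical numbers (a 90 fps baseline, the rough ~10% GPU-limited uplift mentioned above, and the 300-400€ net swap cost):

```python
# Illustrative cost-effectiveness of a CPU swap in a GPU-limited title.
# All inputs here are hypothetical examples, not measurements.
def euros_per_extra_fps(net_cost_eur: float, base_fps: float, uplift_pct: float) -> float:
    extra_fps = base_fps * uplift_pct / 100
    return net_cost_eur / extra_fps

print(euros_per_extra_fps(300, 90, 10))  # ~33 EUR per extra frame, best case
print(euros_per_extra_fps(400, 90, 10))  # ~44 EUR per extra frame, worst case
```

At 30-45€ per extra frame, the same money put toward a GPU (where one is upgradable) usually buys more.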

I guess 8K X owners are more in need of additional performance than 5K+ owners.

1 Like

@drowhunter

My friends

I am eager to begin my VR adventure with the Pimax 8K X, and I have to assemble a new PC.

I checked with the supplier today and I am next on the list for my extreme 3090.

The 3090 is better than the competition. The AMD card’s speed comes from its higher clock compared to the 3090.
But the 3090 has 10496 FP32 CUDA cores while the 2080 Ti has a total of 4352. To put this in perspective, the 3090 is like a gaming console: at the beginning, game code does not use the full potential of the graphics board. Turing architecture SMs were optimized for INT32 execution, and the code of today’s games is a mix of FP/INT instructions. This mix will shift toward FP32 execution because of RT.
10496 is much more than the 5120 of the AMD 6900 XT.
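Raw core counts only translate into throughput via clocks. A minimal sketch of theoretical peak FP32 rate (cores × 2 FLOPs per cycle for FMA × clock); the boost clocks here are approximate reference-card values and are assumptions, not sustained in-game clocks:

```python
# Theoretical peak FP32 throughput: shader cores * 2 FLOPs/cycle (FMA) * clock.
# Boost clocks are approximate reference values, not sustained in-game clocks.
def peak_fp32_tflops(cores: int, boost_ghz: float) -> float:
    return cores * 2 * boost_ghz / 1000

print("RTX 3090:   ", peak_fp32_tflops(10496, 1.70))   # ~35.7 TFLOPS
print("RTX 2080 Ti:", peak_fp32_tflops(4352, 1.545))   # ~13.4 TFLOPS
print("RX 6900 XT: ", peak_fp32_tflops(5120, 2.25))    # ~23 TFLOPS
```

Peak TFLOPS is an upper bound, not game performance, which is why the in-game gap is much smaller than the raw core counts suggest.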

Speaking of CPUs and motherboards, buying new tech is always a risk.
But then nobody would buy an old GPU when there is a new one.
For CPUs and motherboards it’s better to check compatibility and known issues.

At the moment there are issues between the new AMD chipsets and the G2.
Instead, I have in hand a stable Intel board, a special “VR motherboard” as I call it.

AMD CPUs and GPUs have Smart Access Memory.
This is probably coming to Intel/NVIDIA as well.

AMD is a PCIe 4.0 platform, but this does not give an advantage in games, perhaps 1%.
To get lightning storage speed you need compression support in games (end of 2021/2022).
Ampere will support this compression tech.

So when I see that the Intel 10900K (10 cores) and the AMD 5900X (12 cores) ARE HEAD TO HEAD
IN GAMES, I like the Intel/NVIDIA platform much more.

All you want to Know is here

But when you take into consideration DLSS, ray tracing, Cyberpunk 2077 and VR benchmarks,
INTEL + NVIDIA WINS WITHOUT DOUBT.

So I have two ways:

I will buy all the remaining hardware pieces next month, after CES 2021,
or I wait another two months, just for the hell of it, to build a 6 GHz overclocked PC
with an Intel Rocket Lake 11900K.

1 Like

This is exactly why devs need to properly utilize AMD.

The G2 is interesting in that it specifically has an issue on AMD while other headsets are fine thus far. So what did HP change that is causing the problem?

The 3090 is a top card, though it is not really targeting consumers at its release price point.

Why bother with a 3090 when you could go with one of the cards meant for powering roadside TVs at $4000 :us: plus?

With AMD’s new GPU line we need to wait and see what comes. The new consoles, though, have shown some nice gains with initial RT on them. I posted a nice video on it in the AMD GPU discussion.

As for PCIe 4.0? As Gamers Nexus already said, I believe, we still haven’t saturated PCIe 3.0 atm. Its main use at present is storage drive speeds.

Gamers Nexus did a test quite a while ago comparing x8 vs x16, and the difference was not large at the time.

BTW, whatever happened to your enthusiasm for the SMAS?

For VR we have the Quest 2, Index, G2, and Pimax.

The Quest is untethered VR gaming. When you use the cable, the G2 and Pimax have better image quality.

No one could like the Index more than the G2. Again, the G2 and Pimax have better image quality.

So, considering compatibility and image quality, IT’S IMPOSSIBLE TO BUY AMD FOR VR,
because HP and Pimax have had poor support for AMD hardware until now.

Yes, but the 3080 in VR is faster than the AMD competition.

Speaking of gaming on a monitor, again, there have always been AMD GAMES and NVIDIA GAMES.
However, there are (non-gaming) benchmarks showing what the 3080/3090 can do compared to the 2080 Ti:
with 10496 FP32 cores (3090) etc., far superior results to what we see in games. If these results transfer to next-generation games (titles whose names we don’t know yet), all of today’s benchmarks are waste paper, because there is nothing hidden under the table in RDNA2. What you see now is what you get from AMD.

Also, it’s impossible that Rocket Lake goes slower than Comet Lake. Yes, multicore tests could see Comet Lake winning, but in gaming, and especially in VR (where clock speed is the most important thing), Rocket Lake will be the king until Golden Lake in 2022 (VR only).

Now that’s misinformation. I have been using my AMD R9 390 8GB since the Pimax release. Yes, the RX 480 through Vega cards had issues.

AMD CPUs, on the other hand, have had no real problems. @bubbleball was using an FX-8350, and I, as well as many others, have been using Ryzen CPUs.

Both AMD and Pimax have improved support for later GPUs.

So no, it is not true that you can’t do VR on AMD.

In fact, when WMR released, AMD APUs performed considerably better than Intel with an iGPU. It was playable on an APU and not so much on an Intel iGPU.

The G1 was working on AMD, so it would be interesting to see if the G1 has the issue that the G2 has on the latest chipset you mentioned.

My quick search is only showing an issue between the new AMD 6000 series graphics cards and HP.

2 Likes

You might want to recheck benches. Plus, we’re still waiting on AMD to roll out its DLSS equivalent and DXR support on the PC platform. Otherwise, benches show that AMD is doing fine against the 3080 overall.

The 3080ti will likely be a different story.

Interesting, your article says this:

Game Benchmarks

With these new insights I configured our CPU testing platform to DDR4-3800 and installed a GeForce RTX 3090. I then ran through our whole games list for the CPU bench, with surprising results.

There are many charts—I’ll describe the highlights so you know what to look for.

First of all, one of the most important discoveries is that 720p with the RTX 2080 Ti isn’t completely CPU limited in many titles. If it were, we would only see minimal differences in FPS between 2080 Ti and 3090. This does only happen in Sekiro, a little bit in Wolfenstein, Far Cry and others, but not to the degree we expected.

Another takeaway is that at higher resolution, when more GPU bound, the difference between AMD and Intel is almost negligible. Good examples of that are Wolfenstein 4K, Witcher 4K, and Tomb Raider 1080p and up, with other titles showing similar patterns at 4K mostly.

The last and most important result is the highlight of Zen 3—AMD has Intel beat in gaming performance, at least with fast memory at 3800CL16 and an Ampere graphics card.


Suggesting “AMD CPU + Nvidia” is trumping “Intel + Nvidia”.

1 Like

About Pimax and Amd you can read here:

https://www.reddit.com/r/Pimax/comments/dyyy8d/pimax_5700_xt_compatibility/

You know, many times I asked what configuration the user had.

I have found issues with games in the AMD CPU + Nvidia configuration that do not exist with Intel + Nvidia.

I have not found the opposite.

AMD is not good for VR.

Period.
Period Mark.