CPU Cache vs. Cores: AMD Ryzen Edition

It would be more useful if the clock rates were equalized across the CPUs.

I don't see the point of limiting yourself to presenting data we already know.
 
As I noted years ago: there's a point where adding more cores is no longer going to improve performance, simply because the code can't be made any more parallel. And it's not shocking at all that the majority of titles start hitting that wall in the 6-8 core range. After that you're chasing incremental gains (and can even see small losses due to cache-coherency and scheduling overhead).

So yeah, these new 12x2 core CPUs that come out at the high end are great... but you really don't need them when a basic 8-core with a ton of cache will run circles around them in the majority of tasks.
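The scaling wall described above is essentially Amdahl's law. Here's a minimal Python sketch, assuming a hypothetical game workload where 90% of the work parallelizes (an illustrative figure, not measured data), showing how the theoretical speedup flattens right around the 6-8 core mark:

```python
# Illustrative only: Amdahl's law, the usual model for why core scaling stalls.
# The 90% parallel fraction below is an assumed number, not a measurement.

def amdahl_speedup(cores: int, parallel_fraction: float) -> float:
    """Theoretical speedup when only part of the work scales with core count."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / cores)

for cores in (2, 4, 6, 8, 12, 16):
    print(f"{cores:2d} cores: {amdahl_speedup(cores, 0.90):.2f}x")
```

With those assumptions, going from 8 to 16 cores only moves the speedup from roughly 4.7x to 6.4x, while the early core additions pay off far more, which matches the 6-8 core wall described above.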
 
More cores help to compile shaders quickly tho
 
Very interesting! I was thinking of going for a 5800X3D from my 5600X but given this and the 5600X3D review, I will go for the 5600X3D... next time I happen to drive by a Micro Center.

Thank you Steve
 
As a side note: if it's 6c/12t and 8c/16t, it would be better to show that in the diagrams, because 6c/6t != 6c/12t in gaming, especially considering AMD's SMT implementation. That's probably why 12 threads are still enough for gaming.
--------

PS: 6c/12t was comparable to 8c/8t, as far as I can recall. But that's a story for another article, I guess.
 
Another point showing that AMD is doing a good job with their CPUs; hopefully one day they can do just as well with their GPUs.
 
Excuse me, but what GPU are you using with the 5600X, and what games are you usually playing?
I am using a 4070 Ti - I play mostly action games, but lately BG3. Cyberpunk... it's kind of a mess with this setup. I am hoping the new CPU will help with stuttering/1% lows. But knowing me, I'll get obsessed with some 2D game and then it won't even matter haha, Heat Signature will be the death of me.
 
So let's downgrade our monitors to 1080p to put our 4090s to good use on our EOL systems :)

I'm just kidding. I know why this is done this way. At least the 720p benchmarks are gone with the 4090. Still, those tests seem to produce interesting results until they meet reality, where Intel 10th-gen Comet Lake and AMD Zen 3 rigs are rarely paired with the highest-tier GPU.

And even if you do, the natural habitat for a high-cost RTX 4090 is a high-cost 4K/5K monitor and maybe, just maybe, high-refresh-rate WQHD (but then you most likely overpaid for your GPU).

Anyway, if you bench for reality, those differences become narrow and boring very fast. So for me this is not an interesting test. Your mileage may vary.

 
Nice story bro.

I found this test very interesting.
 