Considering 14th Gen is a refresh, the 9700X is literally fighting two-year-old tech here and still loses.

It makes no sense anyway to buy a non-3D chip for gaming, which is one of AMD's big problems:

Dual-CCD is mediocre for gaming, due to inter-CCD latency and the clock-speed difference between CCDs.

Non-3D is not really good for gaming.

3D is good for gaming and nothing else, due to gimped clock speeds (let's hope the 3D cache will be less sensitive next time and 9000X3D will fix this, with increased clock speeds to follow). AMD should be using 3D cache on both CCDs for the 9900X3D and 9950X3D, but they sadly won't. Cheaped out again. Lost opportunity.

Also, AMD still sits at just 6000 MT/s, where Arrow Lake will do 8000-9000, even 10000 MT/s.

Hell, some people can't even hit 6000 stable on Ryzen 9000... just like many 7000 owners can't hit 6000, even though AMD calls this the sweet spot (DDR5-6000 CL30).

The Intel 285K, 265K and 245K are going to be highly popular with people who want good all-round performance, and TSMC N3B will fix the power-draw issues for sure. Also, HT is removed, which lowers temps/watt usage, allowing for higher clock speeds on the real cores.

Both ST and MT go up with Arrow Lake, and watt usage drops. A lot.

Intel simply delivers better all-round performance. This was true for Alder Lake, Raptor Lake and the refresh as well. The problem, though, for some, was power usage. This will be fixed with TSMC N3B, and the chips will have OC headroom too. Intel is not forced to run aggressive clock speeds now that it has the node advantage.

AMD will regret using TSMC N4P (a 5nm-class node) for Ryzen 9000 very soon. They probably already regret it, considering the Zen 5 reviews were mediocre. Zen 5 was a hyped-up, brand-new arch, built from the ground up, and it failed.

Perfect timing for Intel, really. Lunar Lake and Arrow Lake are weeks away, and they use a next-gen 3nm node.
 
Hell no. The 14700K gets 13% better perf for 104% more power. That's the price difference between using a single-tower air cooler and an AIO. Also, the 9700X does not crash, unlike its Intel counterpart.
"bUt mUltIcOre" - Where I live, the 14700K is priced similarly to the similarly performing 7900X. Anyone who wants real productivity would be smart enough to go 7900X instead of a Ryzen 7.

Memory timings? The faster your memory is, the higher your latency is, and the higher the cost. Given I'm still kicking along with an i5-6500 in an EliteDesk, I just need something that is sufficient for the next 5 years. DDR4-3200 is still perfectly fine, and I'm sure DDR5-6000 is going to be absolutely fine in the next few years.
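To be clear on the latency point: the CL number climbs with speed, but the actual first-word latency in nanoseconds stays roughly flat. A minimal sketch, using typical retail kit timings as assumptions:

```python
# First-word CAS latency in nanoseconds. DDR transfers twice per clock,
# so one clock period is 2000 / (MT/s) nanoseconds.
def cas_latency_ns(mt_s: int, cl: int) -> float:
    return 2000 / mt_s * cl

# Typical retail kits (assumed timings, not from the article):
for name, mt_s, cl in [("DDR4-3200 CL16", 3200, 16),
                       ("DDR5-6000 CL30", 6000, 30),
                       ("DDR5-8000 CL38", 8000, 38)]:
    print(f"{name}: {cas_latency_ns(mt_s, cl):.1f} ns")
# DDR4-3200 CL16: 10.0 ns
# DDR5-6000 CL30: 10.0 ns
# DDR5-8000 CL38: 9.5 ns
```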

Next-gen Intel is rumoured to be slightly slower than the 14700K, but uses far less power. If that translates to equal or better performance at a good price, then the 9700X would be bad value. If not, the 9700X comes out on top, as I'm certain the Ryzen 7 will get price cuts.

You are still right to a degree when you say both chips are bad value though.
 
I don't really care much about power usage.
We are talking performance here, and nothing else.

"Faster in productivity than any other AMD CPU"

Where I live, the 14700K is priced on par with the 9700X and 7800X3D. The 14700K absolutely destroys them in productivity, and gaming performance is similar.

Memory speed matters for tons of workloads and games.

The next-gen Intel 285K is not rumoured to be slower than the 14700K, LMAO. Rumours show it beating the 14900K easily, by 10-15% in both ST and MT, while using a lot less power.
 
How does it "loose" [sic]? It provides the same or better performance at less than half the Watts lol.
Less performance means losing. To a 2+ year-old arch. Zen 5 is a brand-new arch.

The 14700K uses 113W on average in gaming.
The 9700X uses 71W on average.

No one would notice the difference.

The 14700K uses the same power in gaming as the 7950X; the 14700K beats it, though.
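For scale, here's a rough sketch of what that 42W gaming gap adds up to over a year (gaming hours and electricity price are assumptions, not measured figures):

```python
# Yearly cost of the 42W gaming-power gap (113W vs 71W averages above).
delta_w = 113 - 71
hours_per_day = 2      # assumed daily gaming time
price_per_kwh = 0.30   # assumed electricity price in $/kWh

kwh_per_year = delta_w * hours_per_day * 365 / 1000
print(f"{kwh_per_year:.0f} kWh/year, about ${kwh_per_year * price_per_kwh:.0f}/year")
# -> 31 kWh/year, about $9/year
```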


 
I'd imagine that for the 99% of gamers who don't own a 4090, it doesn't matter what CPU you own as long as it was produced somewhere in living memory.
 
Most serious/competitive gamers are CPU-bound, not GPU-bound, especially people using high-refresh-rate monitors pushing high fps.

Most 4090 owners are high-res gamers, meaning they are mostly GPU-bound.

So it's the other way around.
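The logic is simple: frame rate is capped by whichever side takes longer per frame, the CPU preparing it or the GPU rendering it. A toy sketch with purely illustrative frame times:

```python
# Frame rate is limited by the slower of the two per-frame costs.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000 / max(cpu_ms, gpu_ms)

# Illustrative numbers only: light GPU load at 1080p, heavy at 4K.
print(f"1080p esports: {fps(cpu_ms=4.0, gpu_ms=2.5):.0f} fps (CPU-bound)")
print(f"4K AAA title: {fps(cpu_ms=4.0, gpu_ms=12.0):.0f} fps (GPU-bound)")
# -> 250 fps vs 83 fps: at 4K a faster CPU barely moves the needle
```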
 
People always mention AMD's lower power usage; however, AMD generally uses more at idle. Explain?

The 9700X uses 78W.
The 14700K uses 59W.

AMD uses 20-25 watts more than Intel, at all times, when you don't push the chip.

 
I guess the AMD chipset uses more at idle; however, once you start gaming, the efficiencies of the CPU overcome the higher idle wattage? So it really depends on your use case. If you turn your PC on, start gaming, and then turn it off, AMD wins? All just assumptions.
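A rough break-even sketch along those lines, using the wattages quoted in this thread (the daily idle and gaming hours are pure assumptions):

```python
# Idle vs gaming trade-off, using the numbers quoted in this thread:
# idle 78W (AMD) vs 59W (Intel); gaming 71W (AMD) vs 113W (Intel).
idle_delta_w = 78 - 59     # extra idle draw on the AMD system
gaming_delta_w = 113 - 71  # extra gaming draw on the Intel system

def daily_wh_saved_by_amd(idle_h: float, gaming_h: float) -> float:
    # Positive means the AMD system uses less energy over the day.
    return gaming_h * gaming_delta_w - idle_h * idle_delta_w

print(daily_wh_saved_by_amd(idle_h=6, gaming_h=2))  # -30: Intel ahead
print(daily_wh_saved_by_amd(idle_h=2, gaming_h=4))  # 130: AMD ahead
```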
 
But who cares about power usage really, whether it's 60W or 80W, 90W or 130W?
Pretty much no one in reality. For GPU-bound gamers, CPU load is very limited too.

Besides, Intel will use TSMC N3B next. Lunar Lake and Arrow Lake are launching in just a few weeks. Power draw will be vastly lower.

However, the 14700K, which is pretty much a 13700K, beats a 9700X overall in gaming. The 13700K was released in 2022; the 9700X was released recently and was a hyped-up, brand-new arch, built from the ground up. Not really impressive, or do you disagree?

When looking at application perf, the 13700K/14700K destroys a 9700X as well. More power? Yep, but also way more performance.

Intel simply delivers better all-round performance. The con is power usage, for now; that should be fixed shortly with TSMC N3B. AMD lost the node advantage. Personally, I look forward to what AMD will do. Lower prices as usual, I guess? After all, TSMC N4 (5nm-class) is around 50% cheaper than N3 (3nm-class). Even with Intel's troubles, they can still afford to use top-tier nodes at TSMC. AMD can't.

Zen 5 pricing was already adjusted once, though: a $50 drop on average.
 
Interesting article!

Question though: did you test with a pre-release version of MSFS 2024, or was this a typo? MSFS 2024 hasn’t been released yet.
 
I'd imagine that for the 99% of gamers who don't own a 4090, it doesn't matter what CPU you own as long as it was produced somewhere in living memory.

Correct. If you look at other websites that also show 4K results, my 5600 is 5% slower. Even a 3600 is a respectable 11% slower. This is all with a 4090, which 99.9% of gamers don't have. Margins will be much slimmer with most people sitting between a 4070 and a 7900 XTX.

Would I build an entire new system with a new mobo/CPU/DDR5 just for 5%? Honestly, who would? I'd rather get a nice OLED monitor or a new laptop. Or spend it on something completely different. My 1600 lasted me 5 years, and I replaced it with a 5600 almost 2 years ago. I don't see any need to replace that for another 2-3 years. My mobo (Taichi X370) will be 10 years old when I retire it. What do mobos offer now that mine doesn't have? I only see USB4 and useless PCIe gen updates. PCIe Gen 3 is still not a bottleneck, even for a 4090.

These CPU gens are such minor upgrades. It's like cellphones. They have to pump up the numbers by using a 4090 at 1080p.
 
Same. From a 1600 to a 5700X on a B450 I got back in 2018. Only a couple of years in, so it's not going anywhere anytime soon. And I do care about wattage; no way I'm moving from my 65W preference.
 
AMD uses 20-25 watts more than Intel, at all times, when you don't push the chip.
That's only true if you use non-stock memory settings. Lose 3-5% of performance and you get decent idle values.
 
My 1600 lasted me 5 years, and I replaced it with a 5600 almost 2 years ago. I don't see any need to replace that for another 2-3 years.
I actually went from an i5-3570K to a 7600. In all honesty, the i5 was fast enough for my development projects, but the surrounding hardware was starting to fail and updates wouldn't run. I've retired now, and I don't expect to upgrade again.
 
Most serious/competitive gamers are CPU-bound, not GPU-bound, especially people using high-refresh-rate monitors pushing high fps.

Most 4090 owners are high-res gamers, meaning they are mostly GPU-bound.

So it's the other way around.

It would be very cool if they did an article about where the bottlenecks fall in current-gen hardware.
 
I don't really care much about power usage.
We are talking performance here, and nothing else.

"Faster in productivity than any other AMD CPU"

Where I live, the 14700K is priced on par with the 9700X and 7800X3D. The 14700K absolutely destroys them in productivity, and gaming performance is similar.

Memory speed matters for tons of workloads and games.

The next-gen Intel 285K is not rumoured to be slower than the 14700K, LMAO. Rumours show it beating the 14900K easily, by 10-15% in both ST and MT, while using a lot less power.
If their prices are similar, then go 7800X3D. The 14700K is, as of now, something to avoid at all costs, and the 7800X3D blazes ahead of it. Steve did a review comparing the i9 to the 7800X3D, and the latter beat the former easily.

And we are talking about the 265K; I got this from videocardz.net. Again, nothing can be confirmed before Intel proves it.
 
I guess the AMD chipset uses more at idle; however, once you start gaming, the efficiencies of the CPU overcome the higher idle wattage? So it really depends on your use case. If you turn your PC on, start gaming, and then turn it off, AMD wins? All just assumptions.
Less performance means losing. To a 2+ year-old arch. Zen 5 is a brand-new arch.

The 14700K uses 113W on average in gaming.
The 9700X uses 71W on average.

No one would notice the difference.

The 14700K uses the same power in gaming as the 7950X; the 14700K beats it, though.


113W in gaming is an almost best-case number; the 14700K easily tops 180W.
No one buys a 7950X for gaming, and the 7950X flogs the i7 in productivity despite having four fewer cores. Oh, and the 7950X does not commit suicide under the manufacturer's limits, which were a horrible mess. That should be an auto-win for anything.

If the new Lion Cove and Skymont cores can beat AMD, good for Intel. But as of right now, Intel has a lot of proving it needs to do.
 
In the end, I expect the new CPUs from Intel to deliver single-digit gains in games, which should put them around or slightly below the current X3D CPUs.

AMD really needs to lower prices for Zen 5, but I think they'll wait for Zen 4 stock to dry up first.
 
113W in gaming is an almost best-case number; the 14700K easily tops 180W.
No one buys a 7950X for gaming, and the 7950X flogs the i7 in productivity despite having four fewer cores. Oh, and the 7950X does not commit suicide under the manufacturer's limits, which were a horrible mess. That should be an auto-win for anything.

If the new Lion Cove and Skymont cores can beat AMD, good for Intel. But as of right now, Intel has a lot of proving it needs to do.
You know what average means, right?

Those numbers were averages.
 
If their prices are similar, then go 7800X3D. The 14700K is, as of now, something to avoid at all costs, and the 7800X3D blazes ahead of it. Steve did a review comparing the i9 to the 7800X3D, and the latter beat the former easily.

And we are talking about the 265K; I got this from videocardz.net. Again, nothing can be confirmed before Intel proves it.
The 14700K might lose slightly in gaming, but it destroys the 7800X3D outside of gaming. You know, some people want good all-round performance, not just good performance in gaming -or- applications.

Even in gaming, the 14700K is only like 5-10% slower at 1080p, and at 4K/UHD there's pretty much no difference.

Who cares though; Arrow Lake hits on 3nm in a few weeks. This is what AMD has to compete with, not a two-year-old arch like 13th and 14th Gen.

The fact that the 9700X loses to the 14700K in gaming is a massive fail if you ask me, considering the 9900X and 9950X are worse for gaming.

AMD needs 9000X3D to compete, but it will still lose in productivity. 3D chips are mediocre for productivity, and non-3D chips are mediocre for gaming.

See, AMD can't deliver it all in a single chip. You have to choose between gaming and productivity performance. That is their big problem. The 7800X3D is a great gaming chip; I have one. Outside of gaming, though, it's worse than the 7700X, which is barely mid-range.

Let's hope the 9800X3D will run vastly higher clock speeds, because the 9900X3D and 9950X3D won't even beat the 7800X3D in gaming, due to dual-CCD issues and latency.
 
Your whole post is pretty much BS. First, the 14700K is only good for benchmarks; everywhere else it just sucks, because Intel's hybrid architecture was a failure from the start.

You say 5-10% does not make a difference; well, that's around the same improvement you can expect from Arrow Lake.

3D chips are excellent for both productivity AND gaming. Dual-chiplet does not actually harm gaming if you just make some tweaks. And Intel CPUs also suck everywhere unless you do tweaks, so we are even here. AMD delivers something that works for everything; Intel does not. Good to remember that gamers can make OS-level tweaks, while those who use computers in a work environment have much more restrictions.
 
