"Sadly, those 12GB, 16GB, and larger VRAM buffers are being underutilized here"
There's an opportunity opening for AMD here. If AMD can get developers to maximize (saturate) 16GB of VRAM with higher-quality textures, the conversation will move away from RT to VRAM. Tech media will praise AMD for having more VRAM in the midrange, those 12GB and 8GB cards will look like an even bigger joke, and AMD can probably start to move the needle on market share. This is the easiest near-term win without any performance penalty.

Also, many are mentioning that this title is extremely CPU bottlenecked. If possible, could you do a CPU scaling benchmark for it? No pressure.
Always a great job!
 
And look what happens when Gimpworks or RaTX features are not implemented in games...

They just work...
 
Seems to me that the engine is limited to 120 FPS.
 
The gap between the 7900 XTX and the 4080 at 4K closes significantly going from ultra to high. Are we sure everything is actually running at ultra on the 7900 XTX? The 7900 XTX gains only 10 fps while the 4080 gains 17 fps.
That's a 13% vs. 27% gain! We shouldn't see that if all else is otherwise equal.

Seems most likely that AMD GPUs are not running one or more of the settings at ultra, rather than AMD just running that much better at ultra. That could explain the poor FSR gain as well; the setting might kick back on at a lower render resolution.
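A quick sanity check of those numbers (the Ultra baselines below are back-calculated from the quoted gains, not read off the review's charts, so treat them as rough):

```python
# Back-of-the-envelope check of the quoted Ultra -> High gains.
# Baselines are derived from the quoted numbers (10 fps ~ 13%, 17 fps ~ 27%),
# so they are approximations, not chart values.
xtx_gain, xtx_pct = 10, 0.13
r4080_gain, r4080_pct = 17, 0.27

xtx_ultra = xtx_gain / xtx_pct        # ~77 fps at 4K Ultra
r4080_ultra = r4080_gain / r4080_pct  # ~63 fps at 4K Ultra

gap_ultra = xtx_ultra / r4080_ultra - 1                             # ~+22% XTX lead at Ultra
gap_high = (xtx_ultra + xtx_gain) / (r4080_ultra + r4080_gain) - 1  # ~+9% XTX lead at High

print(f"Ultra: XTX ~{xtx_ultra:.0f} vs 4080 ~{r4080_ultra:.0f} fps ({gap_ultra:+.0%})")
print(f"High:  XTX ~{xtx_ultra + xtx_gain:.0f} vs 4080 ~{r4080_ultra + r4080_gain:.0f} fps ({gap_high:+.0%})")
```

If those back-calculated baselines are anywhere close, the XTX's lead shrinks from roughly 22% at ultra to roughly 9% at high, which is the kind of swing you'd expect if one setting silently isn't applying.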
 
"Sadly, those 12GB, 16GB, and larger VRAM buffers are being underutilized here"
There's an opportunity opening for AMD here. If AMD can get developers to maximize (saturate) 16GB of VRAM with higher-quality textures, the conversation will move away from RT to VRAM. Tech media will praise AMD for having more VRAM in the midrange, those 12GB and 8GB cards will look like an even bigger joke, and AMD can probably start to move the needle on market share. This is the easiest near-term win without any performance penalty.

Also, many are mentioning that this title is extremely CPU bottlenecked. If possible, could you do a CPU scaling benchmark for it? No pressure.
Always a great job!
You underestimate the financial power that Nvidia has over a significant slice of game studios.

 
You underestimate the financial power that Nvidia has over a significant slice of game studios.
I don't. If we look at the textures in some of Nvidia's showcase titles like Black Myth: Wukong and Cyberpunk 2077, they lack higher-quality textures but heavily push RT performance. Nvidia wouldn't want its 4070 Ti Super and lower cards to be crushed by higher-quality textures; it wants the conversation to stay on ray tracing performance. In this week's AMD press release, an executive mentioned that they want developer support but are getting pushback due to a lack of market share. It's the classic chicken-and-egg problem. Meanwhile, AMD is a significant player in consoles and now handhelds as well. Does AMD take itself seriously, or did it forget how to be competitive?

Update: meanwhile, DLSS or RTX textures are probably a thing that will be upon us soon, exclusive to RTX hardware.
 
"Sadly, those 12GB, 16GB, and larger VRAM buffers are being underutilized here. "

I feel there is nothing sad about this. When a game exceeds 8GB, developers get accused of not properly optimizing it. So while there is some degradation in texture quality, something has to give so that people with 8GB cards can also enjoy the game. To be honest, the game looks reasonably good, and you don't need cutting-edge graphics that leave you with terrible performance as a result.
 
Great review, and great news for mid-range GPU owners!

Glad we don't need a 4090 to play this game.

Warhammer just added to my wish list.
 
I don't. If we look at the textures in some of Nvidia's showcase titles like Black Myth: Wukong and Cyberpunk 2077, they lack higher-quality textures but heavily push RT performance. Nvidia wouldn't want its 4070 Ti Super and lower cards to be crushed by higher-quality textures; it wants the conversation to stay on ray tracing performance. In this week's AMD press release, an executive mentioned that they want developer support but are getting pushback due to a lack of market share. It's the classic chicken-and-egg problem. Meanwhile, AMD is a significant player in consoles and now handhelds as well. Does AMD take itself seriously, or did it forget how to be competitive?

Update: meanwhile, DLSS or RTX textures are probably a thing that will be upon us soon, exclusive to RTX hardware.
It would be very easy to release high-resolution texture packs separately for those who have more VRAM; it would even be more efficient for the game's install size to adapt to your system specs. Have you ever wondered why this doesn't happen?

Nvidia wouldn't accept its GPUs being forced to run at lower quality. Nvidia has insiders everywhere; it's very difficult to fight it without using the same tactics.
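A minimal sketch of the pack-selection idea, assuming a hypothetical installer that picks an optional texture pack based on detected VRAM; the pack names and thresholds are made up for illustration:

```python
# Hypothetical sketch: choose an optional texture pack by detected VRAM.
# Pack names and GiB thresholds are illustrative, not from any real game.
def pick_texture_pack(vram_gib: float) -> str:
    if vram_gib >= 16:
        return "textures_4k_ultra.pak"   # only downloaded on big-VRAM cards
    if vram_gib >= 12:
        return "textures_2k_high.pak"
    return "textures_1k_base.pak"        # fits comfortably on 8 GiB cards

print(pick_texture_pack(16))  # -> textures_4k_ultra.pak
print(pick_texture_pack(8))   # -> textures_1k_base.pak
```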
 
Seems this game is begging for a CPU benchmark; everyone keeps mentioning one but won't give up the goods.
 
It's very well optimized for PC.

I just fell over reading my own comment :D
 
This is an excellent review!

The thorough native-resolution breakdown of quality and resolution for each GPU is what us gamers want. This review was very straightforward, and gamers can not only see the results but also easily compare the strengths and weaknesses of each card.


Note to Steve:
Any GPU can upscale to suit or tailor a particular game, etc. It's not news anymore...
Thank you for this excellent GPU/game review.
 
"Sadly, those 12GB, 16GB, and larger VRAM buffers are being underutilized here. "

I feel there is nothing sad about this. When a game exceeds 8GB, developers get accused of not properly optimizing it. So while there is some degradation in texture quality, something has to give so that people with 8GB cards can also enjoy the game. To be honest, the game looks reasonably good, and you don't need cutting-edge graphics that leave you with terrible performance as a result.

One point doesn't negate the other. Investing in high-resolution textures would likely mean that those with 8GB of VRAM would need to run the game on medium texture settings, which could be comparable to the current ultra settings mentioned in the article; hardly a bad compromise. Meanwhile, users with high-end hardware and more VRAM would be able to enjoy even better textures.
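To put rough numbers on that trade-off, here's a back-of-the-envelope estimate; the per-scene material count, maps per material, and the BC7 compression assumption are illustrative guesses, not figures from this game:

```python
# Rough VRAM cost of resident textures, assuming BC7 compression
# (~1 byte per texel) plus ~33% extra for the mip chain.
# Material and map counts are illustrative assumptions.
def texture_budget_gib(resolution: int, maps_per_material: int, materials: int) -> float:
    bytes_per_map = resolution * resolution * 1 * (4 / 3)  # BC7 + mips
    return bytes_per_map * maps_per_material * materials / 2**30

print(f'2K ("medium"): ~{texture_budget_gib(2048, 3, 200):.1f} GiB')  # ~3 GiB
print(f'4K ("ultra"):  ~{texture_budget_gib(4096, 3, 200):.1f} GiB')  # ~12 GiB
```

Quadrupling the texel count roughly quadruples the budget, so a 2K-for-8GB / 4K-for-16GB split is exactly the kind of tiering a higher-resolution texture option would allow.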
 
It would be very easy to release high-resolution texture packs separately for those who have more VRAM; it would even be more efficient for the game's install size to adapt to your system specs. Have you ever wondered why this doesn't happen?

Nvidia wouldn't accept its GPUs being forced to run at lower quality. Nvidia has insiders everywhere; it's very difficult to fight it without using the same tactics.



Building higher-resolution textures is more difficult than increasing the polygon count of your 3D models.

For photorealistic graphics, the textures are real photos of real objects.

You can't just increase their resolution; they'll come out fuzzy, with no extra detail.

You can't "photoshop" them either; they'll look more cartoon-like, and ugly.


Just forget it.
 
Sadly, those 12GB, 16GB, and larger VRAM buffers are being underutilized here

Let's take a moment to thank NVIDIA for the stasis in texture quality we've experienced in most games in recent years. Their seemingly endless array of high-performance GPUs with insufficient VRAM comes to mind. Cards with 3GB, 3.5GB (yes, you know the one), 6GB, and 8GB have hindered the development of exceptional game visuals. Then came the 3080 10GB to tell you, again, that VRAM is not that important in a fast GPU. And don't forget that AMD did the same until RDNA 2.

This situation reminds me of Intel's approach to CPU core counts from Haswell onward, which stagnated for years. For nearly a decade you could rely on a 4-core/8-thread CPU like the 4790K, thanks to Intel's deliberately slow pace of innovation. It wasn't until AMD's Ryzen arrived that anything changed; once competition showed up, Intel suddenly began piling on cores like crazy.

So we need a healthy market with robust competition to drive progress. However, consumer awareness is equally important. Articles suggesting that "8GB of VRAM is enough" (such as those on this site, even from the same reviewer) haven't been particularly helpful. I welcome any shift away from the "8GB will be sufficient for a long time" mantra. Additionally, we need more high-resolution texture packs as add-ons to set consumer expectations for new GPUs. Otherwise, 12GB will be the new 8GB for a long time.

BTW:
Let's not forget that this prolonged strategy of inadequate VRAM also impedes the development of local AI solutions, both now and in the near future. This even affects gaming, particularly as new AI-driven games are developed. NVIDIA's own introduction of NVIDIA ACE for Games highlights the problem. Shooting yourself in the foot, one could say.
They used a small LLM with 4 billion parameters (very, very small) because their mid-range cards lack the VRAM needed to handle both the game and the AI model simultaneously. That was a misstep, and it will take years to address the resulting limitations on a large scale. So thanks, Nvidia ;)
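For a rough sense of scale (the 4-billion-parameter figure is from the comment above; the precision options and the omission of KV cache and activation overhead are simplifying assumptions):

```python
# Approximate VRAM needed just to hold a 4B-parameter model's weights,
# before adding KV cache, activations, or the game's own assets.
params = 4e9
for precision, bytes_per_param in [("FP16", 2), ("INT8", 1), ("INT4", 0.5)]:
    gib = params * bytes_per_param / 2**30
    print(f"{precision}: ~{gib:.1f} GiB of weights")
# FP16 ~7.5 GiB, INT8 ~3.7 GiB, INT4 ~1.9 GiB -- a tight fit alongside a
# modern game on an 8-12GB card.
```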
 
I disagree with the quitter's mentality some have expressed here, as it argues for nothing ever improving and for picking some arbitrary point at which to stop developing higher-resolution textures. Why not 2GB of VRAM? Or 256MB?

If a game studio doesn't want to invest the time and money in better textures, that's an OK choice, but it comes with lower sales, since better visuals alone can drive consumer interest.

Spending the time to make better textures adds a higher-fidelity option for those with the VRAM to handle it, and it doesn't take away from setting textures to High or Medium for the rest of us. If reviewers want to focus solely on Max/Ultra textures and ignore how a real user will be playing on their 6600 or 3060 Ti, then so be it (TechPowerUp). It just means their data is irrelevant for the midrange player, and we'll have to look elsewhere for realistic test conditions.

Like this article.
 
Something funky is going on with either the engine or AMD's drivers. The 6800 XT has almost identical frame rates at medium, high, and ultra at the same resolution; only going from 63 to 71 FPS from ultra down to medium? And there is clearly a 120 FPS cap in the engine; no way all those GPUs are hitting the same wall otherwise.
 
Not sure of the exact mission HUB uses for testing, but to give an idea of CPU load, here's how the 5800X3D handles a Tyranid swarm on Operation Skyfire.
 
Something funky is going on with either the engine or AMD's drivers. The 6800 XT has almost identical frame rates at medium, high, and ultra at the same resolution; only going from 63 to 71 FPS from ultra down to medium? And there is clearly a 120 FPS cap in the engine; no way all those GPUs are hitting the same wall otherwise.
Probably just some weirdness in the engine. I'm just sitting back and laughing at how my card (7900 XT, average retail $700 USD) is basically neck-and-neck at 4K ultra with the 4080 Super (average retail $1,000). Again. But yeah, something something Team Green better, AMD drivers bad, Radeon cards can't compete and should be cheaper, etc., etc.
 
Something's wrong here. The 7900 GRE is weaker than the 7900 XT, yet the GRE is ahead of the XT.
 
Something's wrong here. The 7900 GRE is weaker than the 7900 XT, yet the GRE is ahead of the XT.

Where?

I just reviewed all the graphs, and unless I missed something, the 7900 XT is ahead of the 7900 GRE at all resolutions and quality settings. They're close when CPU-limited, as you'd expect, but even there a frame or two separates them.
 
