Graphics Cards: Opinion on GPUs with 8GB VRAM or less - not sufficient for AAA games in 2023

This has been the hot topic in the graphics card market for the past few weeks, ever since the release of some recent AAA games (looking at you, Hogwarts Legacy).

It was evident that, as more and more people adopted higher-resolution monitors as their standard, GPU VRAM requirements would increase as well. But game developers either seem unaware of the VRAM capacities of current GPUs or are targeting high-end GPUs only, because more and more of the games coming out seem to struggle on GPUs with less VRAM. These GPUs are very capable in terms of raw performance but are being badly held back by their VRAM.
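As a rough back-of-the-envelope illustration of why resolution pushes VRAM use up (a simplified sketch only - real engines add compressed textures, mipmaps, and driver overhead on top of this):

```python
def render_target_mib(width: int, height: int,
                      bytes_per_pixel: int = 4, targets: int = 1) -> float:
    """Raw size in MiB of `targets` uncompressed render targets at a resolution."""
    return width * height * bytes_per_pixel * targets / (1024 ** 2)

# A single 32-bit-per-pixel framebuffer:
print(f"1080p: {render_target_mib(1920, 1080):.1f} MiB")  # ~7.9 MiB
print(f"1440p: {render_target_mib(2560, 1440):.1f} MiB")  # ~14.1 MiB
print(f"4K:    {render_target_mib(3840, 2160):.1f} MiB")  # ~31.6 MiB

# A deferred renderer with, say, 6 G-buffer targets at 4K (illustrative count):
print(f"4K G-buffer: {render_target_mib(3840, 2160, targets=6):.1f} MiB")  # ~189.8 MiB
```

The buffers themselves stay small; the multi-gigabyte jumps come mostly from higher-resolution texture packs, but the scaling with pixel count works the same way.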

Despite all this, new leaks suggest that NVIDIA's 'lower end' cards will ship with 8GB/6GB (4060 Ti / 4050).

Do you think the current outrage around VRAM limitations will force NVIDIA to up the memory on their unreleased cards? What are your opinions on this whole topic?

Personally, I think there is a good chance they will revisit/revise the VRAM of their cards as the public outcry grows, similar to the 4070/4080 issue that occurred. They have also shown similar flexibility before with the re-release of 2060 cards back in the crypto mining boom.
 

nRiTeCh

Skilled
Now 12GB is the bare minimum, so obviously cards will carry 12GB or 16GB, and the top end might go upwards to 24GB.
But again, the AIBs can come up with stripped-down versions of these standard configs and sell them as Mini editions with 8/6GB for budget users or for ITX builds.
 
'12GB is the bare minimum' - I agree with you, and so would the huge number of people facing issues. But the problem is GPU manufacturers not paying attention to these issues. Game developers share some blame too, but you can't really fault them when 'realism' is such a huge measuring stick for AAA games.
I understand what a budget gamer is, being one myself, but budget gamers aren't really the 'standard' right now IMO. GPU manufacturers want to justify their high prices by giving us these low-end GPUs, but it's about time they gave even budget gamers the 'standard' of GPU performance in 2023.
 

Psycharge

Recruit
Wouldn't the better approach then be lowering the resolution and turning everything else up (except textures etc., to minimise the strain on VRAM)? Also, optimisation in games seems to be really lacking these days; then again, I haven't tried any recent AAA titles myself, so I can't comment.
 

powervgx

Reseller
I don't think NVIDIA will do that. 8GB is more than enough up to 2K for now, and 12GB is enough for 4K. NVIDIA cards already have 8GB in the 3070 Ti and 12GB in the 3080 Ti. GPUs are getting so powerful that the only way to sell higher-end cards is to reserve the extra VRAM for them. For example, a lot of people buy the 3080 Ti because it has 12GB of VRAM. I am playing Hogwarts Legacy at 4K Ultra and the VRAM usage is 9.5GB.
 

enthusiast29

Skilled
I don't think NVIDIA will do that. 8GB is more than enough up to 2K for now, and 12GB is enough for 4K. NVIDIA cards already have 8GB in the 3070 Ti and 12GB in the 3080 Ti. GPUs are getting so powerful that the only way to sell higher-end cards is to reserve the extra VRAM for them. For example, a lot of people buy the 3080 Ti because it has 12GB of VRAM. I am playing Hogwarts Legacy at 4K Ultra and the VRAM usage is 9.5GB.
Sorry, but you're very wrong. NVIDIA is doing exactly the opposite: they are providing capable cards with low VRAM (3070/Ti/3080) and incapable cards with high VRAM (3060).
This is planned obsolescence. AMD, on the other hand, has not done so, and as history shows, AMD's cards age like fine wine because of it.
At 1080p Ultra, a 3070 Ti has 1% lows of 16 FPS in TLOU, while a 6700 XT does 50 FPS and an Intel Arc A770 does 40 FPS in the same scenario. Completely unacceptable.

Anyway, coming to OP's point: Hogwarts Legacy didn't have as much of an issue as TLOU does. Let me explain why the issue exists at all.

It's the issue of allocation vs actual use of VRAM: the game allocates more than 8GB of VRAM on those cards but doesn't really need it.
Also, AFAIK it happens only on Ultra settings.
TBH no one needs to play at Ultra; that setting is only meant for taking screenshots.
The quality difference between High and Ultra is barely noticeable.
If you still want to be the person who cranks settings all the way up, then you'll always need a near-flagship GPU every other year.
Also, Ultra @ 1080p is pointless, and if you're at 2K then you shouldn't have bought an 8GB card; that's on you.
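The allocation-versus-use distinction above can be sketched with a toy caching allocator (purely illustrative; not how any real driver or engine works): memory is reserved from the "driver" in large chunks, so the reserved figure a monitoring overlay reports can sit well above what the game actually touches.

```python
class ToyCachingAllocator:
    """Reserves memory in big chunks, so `reserved` can far exceed `used`."""

    CHUNK = 512  # MiB reserved per request to the "driver" (made-up figure)

    def __init__(self):
        self.reserved = 0  # what a monitoring tool would report as "VRAM usage"
        self.used = 0      # what the game's assets actually occupy

    def alloc(self, mib: int) -> None:
        # Grow the reserved pool in CHUNK-sized steps until the request fits.
        while self.used + mib > self.reserved:
            self.reserved += self.CHUNK
        self.used += mib

alloc = ToyCachingAllocator()
alloc.alloc(300)  # textures
alloc.alloc(300)  # geometry, shadow maps, ...
print(alloc.used, alloc.reserved)  # 600 used, 1024 reserved
```

The gap between the two numbers is why a reported "9.5GB" can be misleading: it includes headroom the allocator grabbed, not just data the game needs resident.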

I've been saying here for quite a while that the 3070 and lower cards are 1080p High cards now. Sure, they can do 2K in pre-2023 games, but almost all upcoming games will need to be dialed back down to Medium at 2K, or kept to 1080p High.

*By using words like "you", "you're", etc., I'm not pointing at OP but at everyone who does this.
 

Psycharge

Recruit
Sorry, but you're very wrong. NVIDIA is doing exactly the opposite: they are providing capable cards with low VRAM (3070/Ti/3080) and incapable cards with high VRAM (3060).
This is planned obsolescence. AMD, on the other hand, has not done so, and as history shows, AMD's cards age like fine wine because of it.
Wasn't the 3060 changed from 6GB to 12GB because the competing AMD card came with more VRAM? There was a difference in the type of memory too, IIRC; the 12GB version isn't the same as the originally proposed 6GB one.
 
I don't think NVIDIA will do that. 8GB is more than enough up to 2K for now, and 12GB is enough for 4K. NVIDIA cards already have 8GB in the 3070 Ti and 12GB in the 3080 Ti. GPUs are getting so powerful that the only way to sell higher-end cards is to reserve the extra VRAM for them. For example, a lot of people buy the 3080 Ti because it has 12GB of VRAM. I am playing Hogwarts Legacy at 4K Ultra and the VRAM usage is 9.5GB.
I understand what you're saying, and it's true. The fact is that no matter what cards are released or how, there will always be an audience for them and ways to make them work in a 'good enough' way, which satisfies the mass audience.
But what I'm trying to say is that these cards' performance is being limited by their VRAM. As you said, 'for now' - but what about in two years? It's obvious that the 3060 Ti is going to be held back massively in the long run by its VRAM.
A good example of this is how far the 1060 has come. It was released in 2016, and up until last year (I believe) it was the most common card on the Steam hardware survey. We didn't have these VRAM conversations back then, and we are still reaping the benefits.

Again, these cards are more than good enough for a wide range of people; it's just that such capable cards being held back by VRAM seems a bit unfair IMO.
 

powervgx

Reseller
@justanotheraverageguy, you asked a question and I answered it from NVIDIA's perspective. I didn't give my opinion on how much VRAM each card should have. I just explained that the VRAM is still more than enough even two years after these cards launched, so why would NVIDIA bother to add more? That was from the company's perspective, not from the perspective of games, devs, or users. Obviously it is better to have more VRAM; obviously it would be better for all mid-range cards to have 12GB and high-end cards 16-20GB. But the VRAM is not low by today's standards, and NVIDIA does not design products four years into the future, so it is difficult to expect more from them.