In context: Upscaling tech like Nvidia's DLSS can enhance lower-resolution images and improve image quality while achieving higher frame rates. However, some gamers are concerned that this technology might become a requirement for good performance – a valid fear, even though only a few games currently list system requirements that include upscaling. As the industry continues to evolve, how developers address these concerns remains to be seen.

AI, in its current primitive form, is already benefiting a wide array of industries, from healthcare to energy to climate prediction, to name just a few. But when asked at the Goldman Sachs Communacopia + Technology Conference in San Francisco last week which AI use case excited him the most, Nvidia CEO Jensen Huang responded that it was computer graphics.

"We can't do computer graphics anymore without artificial intelligence," he said. "We compute one pixel, we infer the other 32. I mean, it's incredible... And so we hallucinate, if you will, the other 32, and it looks temporally stable, it looks photorealistic, and the image quality is incredible, the performance is incredible."

Jensen is doubling down on observations that Nvidia and other tech executives have made about AI-based upscaling in PC gaming, arguing that it is a natural evolution in graphics technology, similar to past innovations like anti-aliasing or tessellation.

These executives also see AI-based upscaling as increasingly necessary. Graphics technology has the potential to become even more resource-intensive, and hardware-based AI upscaling techniques can help achieve playable frame rates on a wider variety of systems, from handheld and gaming consoles to high-end desktop machines.

It's no surprise that Nvidia is pushing for this future, as its own DLSS upscaling technology is one of the most prominent examples of AI-based upscaling in gaming. The AI model is trained on high-quality images to reconstruct details and produce sharper, cleaner results.
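For a concrete sense of how much reconstruction these upscalers do, the sketch below computes the internal render resolution for the widely reported DLSS-style quality modes. The scale factors are the commonly cited per-axis values and can vary by title and upscaler version; the dictionary and function here are illustrative, not Nvidia's API.

```python
# Internal render resolution for common upscaler quality modes.
# Scale factors are the widely reported per-axis render scales
# (assumed here for illustration; exact values vary by title/version).

MODES = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple:
    """Return the (width, height) the game actually renders before upscaling."""
    s = MODES[mode]
    return round(out_w * s), round(out_h * s)

print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080)
print(internal_resolution(3840, 2160, "Quality"))      # roughly 2560 x 1440
print(internal_resolution(1920, 1080, "Performance"))  # (960, 540)
```

Note how quickly the input shrinks: a 4K "Performance" frame starts from a quarter of the pixels, and 1080p "Performance" starts from 540p, which is why image quality at lower output resolutions is more contentious.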

A competing upscaling technology is Intel's XeSS, which provides up to a 2x performance boost in some games, allowing for higher frame rates without significantly sacrificing image quality (your mileage will vary, of course). XeSS works a lot like Nvidia's DLSS, with the key difference that Intel's XeSS supports graphics cards from multiple vendors, while DLSS is limited to Nvidia graphics cards.

This trend, however, worries some gamers who fear the technology will become essential to good performance. Remnant II set what some consider a dangerous precedent by listing system requirements that assume players are using upscaling. It stands out for explicitly mentioning DLSS in its specs, but many modern games are designed with AI upscaling technologies in mind, even if not included in system requirements.

These concerns have some merit, but despite the reservations, AI-based upscaling seems poised to become a significant trend in computer graphics, whether we like it or not.

AMD's upscaling technology, known as FSR (FidelityFX Super Resolution), currently uses a combination of spatial and temporal upscaling, anti-aliasing, and other techniques to enhance image quality from lower-resolution inputs. However, Jack Huynh, Senior Vice President of AMD's Computing and Graphics division, recently announced that the upcoming FSR 4.0 will incorporate AI – potentially the same implementation expected to debut on the PlayStation 5 Pro.

Microsoft has also introduced its own feature, Auto Super Resolution, which uses AI to upscale games in real time. It requires specific hardware, including an NPU with at least 40 TOPS of processing power. Currently, it is limited to certain games available through the Xbox app for Windows PCs, with supported titles including Borderlands 3, Control, God of War, and The Witcher 3.


 
So back in 2019, when DLSS and upscaling were brand-new technology, it was a valid concern that developers might make games that relied on the technology to be playable, given that a lot of people at that time wouldn't have had graphics cards capable of performing it.

We're ~5 years on from that now. The past three generations of (Nvidia) graphics cards support this technology. It is becoming much more reasonable for developers to assume the availability of this tech and design their games explicitly with it in mind.

Would we prefer that all games can run at 200+ FPS with maximum settings at native 4k resolution? Sure. If we also assume that this isn't possible (yet), then I don't see why we should complain when developers use all the tools at their disposal to get their desired balance of graphics vs performance. Especially as the tech has continued to improve dramatically over the short space of time it's had to evolve.
 
Jensen (probably): Graphics just aren't feasible anymore. GPUs just don't know how to generate images, and our team that creates drivers is stumped! We just can't go on without AI to do this work for us! Let us rejoice to the world that even though we have lost the technology to do this ourselves, we can thank the AI for solving our dilemma and getting us back into the business of creating graphics again!
 
Would we prefer that all games can run at 200+ FPS with maximum settings at native 4k resolution? Sure. If we also assume that this isn't possible (yet), then I don't see why we should complain when developers use all the tools at their disposal to get their desired balance of graphics vs performance. Especially as the tech has continued to improve dramatically over the short space of time it's had to evolve.

I wouldn't mind games making use of upscaling and frame generation when targets are 4k+ resolutions at 120+ fps. Makes sense.

However, there are now games relying on upscaling and frame gen even for 1080p / 60 fps targets - and with relatively steep system requirements. I find that unacceptable. Developers are using these technologies as crutches for laziness and incompetence.
 
We're ~5 years on from that now. The past three generations of (Nvidia) graphics cards support this technology.

I think that the question is what "requires upscaling" implies. If it implies that a game can't run at 1080p at a reasonable frame rate on a $300 card, then I see this as a problem. DLSS is okay at upscaling at 1080p, but it's still not as good as at higher resolutions, and FSR is crap at 1080p.

Also, frame generation (DLSS 3) requires the latest generation of cards. It won't be too surprising if Nvidia introduces yet another DLSS update that requires a GeForce 5000. As long as there's no good cross-hardware solution, upscaling shouldn't be mandatory.
 
I wouldn't mind games making use of upscaling and frame generation when targets are 4k+ resolutions at 120+ fps. Makes sense.

However, there are now games relying on upscaling and frame gen even for 1080p / 60 fps targets - and with relatively steep system requirements. I find that unacceptable. Developers are using these technologies as crutches for laziness and incompetence.
You're not wrong. But that's a problem with developers, not with the tech. The tech is great.
 
I think that the question is what "requires upscaling" implies. If it implies that a game can't run at 1080p at a reasonable frame rate on a $300 card, then I see this as a problem. DLSS is okay at upscaling at 1080p, but it's still not as good as at higher resolutions, and FSR is crap at 1080p.

Also, frame generation (DLSS 3) requires the latest generation of cards. It won't be too surprising if Nvidia introduces yet another DLSS update that requires a GeForce 5000. As long as there's no good cross-hardware solution, upscaling shouldn't be mandatory.
I'm with you on the cross-platform/hardware/software solution. All these different formats are going to become a headache.
As for feature limiting by generation, that's more grey. If the limit is purely arbitrary in software, then that's bad. If it's a legitimate case of "the old hardware can't do it," then it's just inevitable that they'll eventually release things only the new generation can do.
 
What Jensen is actually saying here is: games will run well, but only if you use our cards and our solution. The CUDA way of doing things. Make no mistake, whatever the topic is, Nvidia are talking about money. All companies are, but Nvidia are in a league of their own on that.
 
As hardware and AI both improve yearly, this will not be an issue for long… soon, we’ll be at the point where gaming effects become lifelike - or indistinguishable from lifelike - and the hardware industry will need to create some other reason why you “must” have the latest GPU (or whatever they’re called by then)… AI will only hurry this along…

In 20 years (possibly less), people will look back at these debates and mock us…
 
AMD really have no choice, so they have to say this. They need a CUDA alternative.
You need "AI" for enhancing sound, better encoding, better playback of movies, better numbers to stay in Microsoft's good books with NPU units, etc.

Many people buy Nvidia for the extra tools it brings. If you just want rasterization value for money, AMD wins.

Plus synergies in R&D for the AI market.

Plus, if you build it, they will come - and Nvidia did that with CUDA long ago. Those in the field who really understood the significance of CUDA could have made easy money buying Nvidia stock. How many really bought the stock based on CUDA? Instead, they just saw Nvidia coming along nicely, making higher gross profits on GPUs.
 
Can't wait for this AI crap to be over and done with. The government only supports it because it gives them more ways to spy on people.
 
If you're not happy where things are going then don't buy the products.
But we know you will...
 
Absolutely - it's "required" because you've abandoned putting much effort into improving rasterisation performance and crippled bandwidth, pumping up L2 cache as a sop. You are now relying solely on frame gen and upscaling to do the heavy lifting.

Huang makes a lot of over-the-top BS statements, including his gem: "Kids, you don't need to learn to code, let the AI do it for you." IMO the guy is a disgrace, holding back the industry and pumping his own proprietary standards that are hurting everyone.
 
I wouldn't mind games making use of upscaling and frame generation when targets are 4k+ resolutions at 120+ fps. Makes sense.

However, there are now games relying on upscaling and frame gen even for 1080p / 60 fps targets - and with relatively steep system requirements. I find that unacceptable. Developers are using these technologies as crutches for laziness and incompetence.
It took Jedi Survivor 16 months to finally patch the game to improve performance this week.
Nvidia creates AI features. Publishers go, "Look, we can cut costs by using the power of AI for game development / graphics rendering" (and publishers and game studios go on a firing frenzy).
AMD asks developers, "Why don't you want to program for our hardware?" (showing the disconnect by stating this publicly).
Also, Nvidia loves the brute-force approach to graphics rendering. A 5090 for 4K 60 fps, anyone?
AMD is embracing AI but is somehow focused on the niche of handheld battery-life improvement via FSR 4.
With AI, the whole industry is literally putting the cart before the horse, IMO!
 
Moore's law is dead, and Jensen killed it on purpose. And now the guy thinks he's the next Steve Jobs and AI is the new iPhone. Can't wait for his next video where he takes the 1,000 W, $500,000 5090 GPU out of the oven.
 
Jensen (probably): Graphics just aren't feasible anymore. GPUs just don't know how to generate images, and our team that creates drivers is stumped! We just can't go on without AI to do this work for us! Let us rejoice to the world that even though we have lost the technology to do this ourselves, we can thank the AI for solving our dilemma and getting us back into the business of creating graphics again!

This guy (probably): I drive a truck and electric vehicles make me uncomfortable.

AI image generation is going to go down as one of the best things to happen to gaming graphics. It's an entirely new paradigm in rendering. Yes, it looks a bit sloppy sometimes now, but think about the future. The quote's right there in the article: "we compute one pixel, we infer the other 32." That inference will allow for faster rendering times requiring a fraction of the compute. It's no longer an artist having to fill in a grid one box at a time to make an image; it's an artist getting a rough sketch and being able to paint the whole canvas from that.
 
To be precise, Jensen is saying: you want high FPS, I can get the AI to generate as much as you want to make you feel happy. But in reality, the latency is still crap because the underlying/real FPS is still crap. Let's be real: even if you can use the Tensor cores to predict and generate fake frames, AI is RAM-intensive, and Nvidia will surely be happy peddling expensive GPUs with 6 or 8 GB of VRAM.
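The latency point can be made with simple arithmetic. Under a simplified model (assumed here for illustration; it ignores render-queue and display overheads), frame generation multiplies the displayed frame rate, but input is only sampled once per rendered frame, so responsiveness tracks the underlying rate:

```python
# Simplified latency model for frame generation: generated frames raise
# the on-screen frame rate, but the game only reacts to input on frames
# it actually renders. Illustrative sketch, not a full pipeline model.

def displayed_fps(rendered_fps: float, generated_per_rendered: int) -> float:
    """Frames shown per second, counting AI-generated in-between frames."""
    return rendered_fps * (1 + generated_per_rendered)

def input_sample_interval_ms(rendered_fps: float) -> float:
    """Time between frames that actually respond to player input."""
    return 1000.0 / rendered_fps

# 30 fps rendered, one generated frame per rendered frame:
print(displayed_fps(30, 1))          # 60 fps on screen
print(input_sample_interval_ms(30))  # ~33.3 ms between input samples,
                                     # the same as plain 30 fps
```

In other words, doubling 30 fps to 60 fps with frame generation makes motion look smoother but leaves input responsiveness at roughly the 30 fps level, which is the commenter's complaint.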
 
This guy (probably): I drive a truck and electric vehicles make me uncomfortable.

AI image generation is going to go down as one of the best things to happen to gaming graphics. It's an entirely new paradigm in rendering. Yes, it looks a bit sloppy sometimes now, but think about the future. The quote's right there in the article: "we compute one pixel, we infer the other 32." That inference will allow for faster rendering times requiring a fraction of the compute. It's no longer an artist having to fill in a grid one box at a time to make an image; it's an artist getting a rough sketch and being able to paint the whole canvas from that.


Then again, some people like made-up stuff - it's what AI is good at, anyway; look at all the fake generated images that people post as being real. I guess if that's good enough for you, then you can have it, but I don't want it.
 
DLSS was implemented as an excuse for how bad the ray tracing implementation was.

Where are Tim and his pitchfork when it comes to denouncing Nvidia for gimping games to 1080p 60 FPS with their GameWorks ray tracing implementation?

At least Jedi Survivor is playable above 45 FPS at 2160p native resolution with an AMD GPU and ray tracing activated.
 
Nvidia CEO: "We can't do computer graphics anymore without artificial intelligence"....yes you can
 
Next-gen consoles will use frame gen from 30 to 60 fps, I'm sure of it.
Frame gen only smooths gameplay; it's not meant to pump up a crippled framerate. 30 fps as a basis for 60 fps frame gen will be just as bad an experience, and Nvidia largely omits this fact. They will rely on upscaling, but it will look like crap. Anything under 1440p native looks bad when upscaling: 1080p performance mode uses a garbage-grade 540p internal resolution, and even 1440p performance mode, using 720p, looks like trash too.
 
Nvidia CEO: "We can't do computer graphics anymore without artificial intelligence"
Hush up Jensen. We all know your statement has nothing to do with reality and everything to do with selling hardware.

When people make statements like that, their credibility wanes.
 
At least Jedi Survivor is playable above 45 FPS
That is not acceptable. 60 FPS is the minimum for a good experience. 30 FPS is playable if it's consistent, and most of the time it isn't.
 
