In context: Upscaling tech like Nvidia's DLSS can enhance lower-resolution images and improve image quality while achieving higher frame rates. However, some gamers are concerned that this technology might become a requirement for good performance – a valid fear, even though only a few games currently list system requirements that include upscaling. As the industry continues to evolve, how developers address these concerns remains to be seen.

AI, in its current primitive form, is already benefiting a wide array of industries, from healthcare to energy to climate prediction, to name just a few. But when asked at the Goldman Sachs Communacopia + Technology Conference in San Francisco last week which AI use case excited him the most, Nvidia CEO Jensen Huang responded that it was computer graphics.

"We can't do computer graphics anymore without artificial intelligence," he said. "We compute one pixel, we infer the other 32. I mean, it's incredible... And so we hallucinate, if you will, the other 32, and it looks temporally stable, it looks photorealistic, and the image quality is incredible, the performance is incredible."

Jensen is doubling down on observations that Nvidia and other tech executives have made about AI-based upscaling in PC gaming, arguing that it is a natural evolution in graphics technology, similar to past innovations like anti-aliasing or tessellation.

These executives also see AI-based upscaling as increasingly necessary. Rendering workloads are only becoming more demanding, and hardware-accelerated AI upscaling can help achieve playable frame rates on a wider variety of systems, from handhelds and consoles to high-end desktop machines.

It's no surprise that Nvidia is pushing for this future, as its own DLSS upscaling technology is one of the most prominent examples of AI-based upscaling in gaming. The AI model is trained on high-quality images to reconstruct details and produce sharper, cleaner results.

Another competing upscaling technology is Intel's XeSS, which provides up to a 2x performance boost in some games, allowing for higher frame rates without significantly sacrificing image quality (your mileage will vary, of course). XeSS works a lot like Nvidia's DLSS, with the key difference that Intel XeSS supports graphics cards from multiple vendors, while DLSS is limited to Nvidia graphics cards.
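
That performance headroom comes straight from shading fewer pixels. Here is a minimal sketch of the arithmetic, assuming some typical per-axis render-scale presets (the names and factors are placeholders; each vendor publishes its own modes and numbers):

```python
# How a per-axis render scale maps to internal resolution and shading work.
# Preset names and scale factors below are illustrative assumptions.

TARGET_W, TARGET_H = 3840, 2160  # 4K output

PRESETS = {"quality": 1.5, "balanced": 1.7, "performance": 2.0}

for name, scale in PRESETS.items():
    w, h = round(TARGET_W / scale), round(TARGET_H / scale)
    shaded = (w * h) / (TARGET_W * TARGET_H)
    print(f"{name:12s} renders {w}x{h}, shading ~{shaded:.0%} of output pixels")
```

At a 2x per-axis scale the GPU shades only about a quarter of the output pixels, which is where headline figures like "up to 2x" frame rates come from once the upscaler's own runtime cost is subtracted.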

This trend, however, worries some gamers who fear the technology will become essential to good performance. Remnant II set what some consider a dangerous precedent by listing system requirements that assume players are using upscaling. It stands out for explicitly mentioning DLSS in its specs, but many modern games are designed with AI upscaling technologies in mind, even if not included in system requirements.

These concerns have some merit, but despite the reservations, AI-based upscaling seems poised to become a significant trend in computer graphics, whether we like it or not.

AMD's upscaling technology, known as FSR (FidelityFX Super Resolution), currently uses a combination of spatial and temporal upscaling, anti-aliasing, and other techniques to enhance image quality from lower-resolution inputs. However, Jack Huynh, Senior Vice President of AMD's Computing and Graphics division, recently announced that the upcoming FSR 4.0 will incorporate AI – potentially the same implementation expected to debut on the PlayStation 5 Pro.
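
To make "spatial and temporal upscaling" concrete, here is a heavily simplified sketch of the temporal-accumulation step at the heart of TAA-style upscalers such as FSR 2 and 3. It is illustrative only – the function, names, and blend factor are assumptions, not AMD's code – and real implementations add jittered sampling, disocclusion and ghosting rejection, and sharpening on top:

```python
# One output pixel of a TAA-style temporal upscaler: reproject last frame's
# accumulated result using a motion vector, then blend in the new sample.

def accumulate_pixel(history, motion, current, x, y, blend=0.1):
    """Blend the reprojected history pixel with the new low-res sample."""
    # Follow the motion vector back to where this pixel was last frame.
    dx, dy = motion[y][x]
    px = min(max(int(x - dx), 0), len(history[0]) - 1)
    py = min(max(int(y - dy), 0), len(history) - 1)
    # Exponential blend: over many frames this accumulates more effective
    # samples per pixel than any single low-resolution frame contains.
    return history[py][px] * (1 - blend) + current[y][x] * blend
```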

Microsoft has also introduced its own feature, Auto Super Resolution, which uses AI to upscale games in real time. It requires specific hardware – an NPU with at least 40 TOPS of processing power – and is currently limited to certain games available through the Xbox app for Windows PCs, with supported titles including Borderlands 3, Control, God of War, and The Witcher 3.

 
So back in 2019, when DLSS and upscaling were brand new, it was a valid concern that developers might make games that relied on the technology to be playable, given that a lot of people at the time didn't have graphics cards capable of it.

We're ~5 years on from that now. The past three generations of Nvidia graphics cards support this technology. It is becoming much more reasonable for developers to assume its availability and to design their games explicitly with it in mind.

Would we prefer that all games can run at 200+ FPS with maximum settings at native 4k resolution? Sure. If we also assume that this isn't possible (yet), then I don't see why we should complain when developers use all the tools at their disposal to get their desired balance of graphics vs performance. Especially as the tech has continued to improve dramatically over the short space of time it's had to evolve.
 
Jensen (probably): Graphics just aren't feasible anymore. GPUs just don't know how to generate images, and our driver team is stumped! We just can't go on without AI to do this work for us! Let us rejoice to the world that even though we have lost the technology to do this ourselves, we can thank AI for solving our dilemma and getting us back into the business of creating graphics again!
 
> Would we prefer that all games can run at 200+ FPS with maximum settings at native 4k resolution? Sure. If we also assume that this isn't possible (yet), then I don't see why we should complain when developers use all the tools at their disposal to get their desired balance of graphics vs performance. Especially as the tech has continued to improve dramatically over the short space of time it's had to evolve.

I wouldn't mind games making use of upscaling and frame generation when targets are 4k+ resolutions at 120+ fps. Makes sense.

However, there are now games relying on upscaling and frame gen even for 1080p / 60 fps targets – and with relatively steep system requirements. I find that unacceptable. Developers are using these technologies as crutches for laziness and incompetence.
 
> We're ~5 years on from that now. The past three generations of Nvidia graphics cards support this technology.

I think that the question is what "requires upscaling" implies. If it implies that a game can't run at 1080p at a reasonable frame rate on a $300 card, then I see this as a problem. DLSS is okay at upscaling at 1080p, but it's still not as good as at higher resolutions, and FSR is crap at 1080p.

Also, frame generation (DLSS 3) requires the latest gen. It won't be too surprising if Nvidia introduces yet another DLSS update that requires the GeForce 5000 series. As long as there's no good cross-hardware solution, upscaling shouldn't be mandatory.
 
> I wouldn't mind games making use of upscaling and frame generation when targets are 4k+ resolutions at 120+ fps. Makes sense.
>
> However, there are now games relying on upscaling and frame gen even for 1080p / 60 fps targets – and with relatively steep system requirements. I find that unacceptable. Developers are using these technologies as crutches for laziness and incompetence.
You're not wrong. But that's a problem with developers, not with the tech. The tech is great.
 
> I think that the question is what "requires upscaling" implies. If it implies that a game can't run at 1080p at a reasonable frame rate on a $300 card, then I see this as a problem. DLSS is okay at upscaling at 1080p, but it's still not as good as at higher resolutions, and FSR is crap at 1080p.
>
> Also, frame generation (DLSS 3) requires the latest gen. It won't be too surprising if Nvidia introduces yet another DLSS update that requires the GeForce 5000 series. As long as there's no good cross-hardware solution, upscaling shouldn't be mandatory.
I'm with you on the cross-platform/hardware/software solution. All these different formats are going to become a headache.
As for feature limiting by generation, that's more grey. If the limit is purely arbitrary in software, that's bad. If it's legitimately a case of "the old hardware can't do it", then it's just inevitable that they'll eventually release things that only the new generation can do.
 
What Jensen is actually saying here is: games will run well, but only if you use our cards and our solution – the CUDA way of doing things. Make no mistake, whatever the topic, Nvidia is talking about money. All companies are, but Nvidia is in a league of its own on that.
 
As hardware and AI both improve yearly, this will not be an issue for long… Soon, we'll be at the point where gaming effects become lifelike – or indistinguishable from lifelike – and the hardware industry will need to create some other reason why you "must" have the latest GPU (or whatever they're called by then)… AI will only hurry this along…

In 20 years (possibly less), people will look back at these debates and mock us…
 
AMD really has no choice, so it has to say this. It needs a CUDA alternative.
You need "AI" for enhancing sound, better encoding, better movie playback, better numbers to stay in Microsoft's good books with NPU units, etc.

Many people buy Nvidia for the extra tools it brings. If you just want rasterization value for money, AMD wins.

Plus, there are synergies in R&D with the AI market.

Plus, if you build it, they will come – and Nvidia did that with CUDA long ago. Those in the field who really knew the significance of CUDA could have made easy money buying Nvidia stock. But how many really bought the stock based on CUDA? Instead, they just saw Nvidia coming along nicely, making higher gross profits on GPUs.
 
Can't wait for this AI crap to be over and done with. The government only supports it because it gives them more ways to spy on people.
 
If you're not happy with where things are going, then don't buy the products.
But we know you will...
 
