> Not just the 10 series but basically all cards that don't have Nvidia's proprietary RTX tech.

Wanted to make a catchy title.
> Let's see how it compares to DLSS 2.1. Anything close to DLSS and better than TAA upscaling would be good enough, given the higher adoption of this tech.

It's better than the performance of the 2080 Ti with DLSS and a bit less than the 3080 with DLSS, as those are second gen and FSR is first gen. I get around 144 fps at 2K in Warzone; make that constant and I am happy with the FSR implementation.
> It's better than the performance of the 2080 Ti with DLSS and a bit less than the 3080 with DLSS, as those are second gen and FSR is first gen. I get around 144 fps at 2K in Warzone; make that constant and I am happy with the FSR implementation.

I am more interested in the quality comparison vs DLSS or existing TAA upscaling and the corresponding performance gains. Getting good image quality with a 15-25% increase in performance is good enough in my book, especially when compared to significantly lower quality with higher performance gains. Let's wait for the Digital Foundry analysis videos for that.
> Anything which is non-proprietary is bound to succeed.

Unfortunately, that's not always the case.
> I am more interested in the quality comparison vs DLSS or existing TAA upscaling and the corresponding performance gains. Getting good image quality with a 15-25% increase in performance is good enough in my book, especially when compared to significantly lower quality with higher performance gains. Let's wait for the Digital Foundry analysis videos for that.

That is what I said: it's around 75% of DLSS 2.1 in terms of performance and somewhat less in terms of quality. That is what the leak suggested, but the results since then have been more solid than expected. I think it will not disappoint us, especially when Nvidia doesn't even support their old cards with this tech.
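If it helps to put rough numbers on these percentages, here is a quick back-of-the-envelope sketch. The 100 fps baseline and the 40% DLSS uplift are purely illustrative assumptions, not measurements from any game:

```python
# Illustrative only: what a 15-25% uplift, or "75% of DLSS's gain", would mean in fps.
# The 100 fps baseline and the 40% DLSS uplift are assumptions, not measurements.

base_fps = 100.0        # hypothetical native-resolution frame rate
dlss_uplift = 0.40      # assumed DLSS 2.1 gain at a comparable quality mode

dlss_fps = base_fps * (1 + dlss_uplift)
fsr_fps_low = base_fps * 1.15            # +15%
fsr_fps_high = base_fps * 1.25           # +25%
fsr_75pct_of_dlss = base_fps * (1 + 0.75 * dlss_uplift)

print(f"DLSS (assumed +40%):      {dlss_fps:.0f} fps")
print(f"FSR at +15% / +25%:       {fsr_fps_low:.0f} / {fsr_fps_high:.0f} fps")
print(f"FSR at 75% of DLSS gain:  {fsr_75pct_of_dlss:.0f} fps")
```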
> Unfortunately, that's not always the case.

So you came up with one example that no one cares about.
Look at TressFX, AMD's equivalent of Hairworks. It both looked and performed better than Hairworks in the same demo (or game? I think I'm forgetting) on AMD as well as Nvidia cards, whereas Hairworks looked worse and had a huge performance hit on both AMD and Nvidia, though of course not as big a hit on Nvidia as on AMD. The performance comparison is with respect to both technologies being switched off. So only time (and of course money behind the scenes and under the table) will tell.
> So you came up with one example that no one cares about.

I was just giving an example, saying that this might happen as well, just to temper expectations.
My example: open-source USB vs FireWire/Thunderbolt.
> Unfortunately, the image quality at 1440p quality mode doesn't look that promising IMO. And this is the mode I think most people will be looking forward to the most, especially the ones with budget GPUs. 1080p modes would be significantly worse for obvious reasons.

To be fair, even the image on the left looks terrible to me lol. For some reason that game has always looked pretty bad to me: soft background details and excessively sharpened character details, which is visible here as well.
Take a look here -> https://images.anandtech.com/doci/16723/AMD_FSR_Example.jpg
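For context on what "quality mode" actually renders at, here is a small sketch of the internal resolutions each FSR mode would upscale from at 1440p. The per-axis scale factors (1.3x, 1.5x, 1.7x, 2.0x) are the commonly cited ones for FSR's modes and should be treated as assumptions rather than a spec quote:

```python
# Internal render resolution each FSR mode would upscale from, for a 2560x1440 target.
# The per-axis scale factors below are assumptions (commonly cited for FSR's modes),
# and the rounding is approximate.

modes = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

target_w, target_h = 2560, 1440

for name, scale in modes.items():
    w, h = int(target_w / scale), int(target_h / scale)
    print(f"{name:>13}: ~{w}x{h} -> upscaled to {target_w}x{target_h}")
```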
> To be fair, even the image on the left looks terrible to me lol. For some reason that game has always looked pretty bad to me: soft background details and excessively sharpened character details, which is visible here as well.

Right, the left-hand side is also pretty bad, but if you look at some of the static objects like the pillars on either side or the textures on the pathway, you'll notice that the quality-mode image is a lot blurrier. Then again, this is just the first example, and I am sure this will vary across different games; hopefully things will only improve in the future. Looking forward to June 22nd to see more information around this.
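For anyone who wants to put a number on "a lot blurrier" instead of eyeballing the screenshot, one crude option is the variance of the Laplacian over matching crops (higher means more fine detail). The file names and crops below are hypothetical; you'd cut the same region, e.g. one of the pillars, out of each half of the comparison image:

```python
# Crude sharpness metric: variance of the Laplacian (higher = more fine detail).
# File names and crops are hypothetical; cut the same region (e.g. a pillar)
# out of each half of the comparison screenshot before running this.
import cv2

def sharpness(path: str) -> float:
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise FileNotFoundError(path)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

native = sharpness("native_pillar_crop.png")        # crop from the native half
fsr = sharpness("fsr_quality_pillar_crop.png")      # same crop from the FSR half
print(f"native: {native:.1f}, FSR quality: {fsr:.1f}, ratio: {fsr / native:.2f}")
```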