Nvidia DLSS has been added to Final Fantasy XV: Windows Edition to boost performance

From what I have been reading it's only worth having a 2080Ti for 4k. Apparently at 1440p games are so held back by the CPU that it (the Ti) is only around 15% quicker than the 2080.

And let's face it, who in their right mind would buy either of those cards for anything less than 4k?

4k will become (eventually) the new 1080p IMO. 1440p isn't an industry-wide accepted resolution (like, I've never seen a 1440p TV set) so yeah, 4k is where it's at.

It's taken a lot longer than the jump to 1080p (due to the high prices of monitors and the weakness of GPUs at launch), but I think within the next 2-3 years pretty much everyone will be on 4k. Once the entry-level and mid-range GPUs can start cranking it out OK we should see monitor prices drop (because more people will be buying them), and thus more people making the upgrade.
 
Of course it is. Why would you lock it? Well, probably because at 4k it looks the greatest and you can't tell the difference between it and other AA formats.

Whereas at 1440p/1080p you could probably tell it looks slightly worse, and therefore people would make fun of it, despite its performance improvement, if any.
 
I mean it makes sense for the 2080 (Ti), but I was looking forward to DLSS boost for 2070 and the upcoming 2060 as well. Hope it's not a hard limitation.
 
Why would you need AA at 4k resolution?
I suppose if you're going through a big screen, but my 4k monitor's a 27 inch screen, absolutely no need for AA at that pixel density.
 
Because resolution does not fix shading and lighting aliasing.
 
It might be a case of DLSS looking better at higher resolutions but becoming detrimental below 4K, perhaps.

Well. You're more of an expert in this area than I am :)

Maybe there is no FPS gain at the lower res due to bottlenecking? So I guess showcasing DLSS on and off at 1440p, where the FPS remains the same, is not a good marketing ploy. From what I gather, the 2080 Ti is quite taxing on the CPU compared to the 1080 Ti and older.
 
Well, I don't know much about this stuff, I'm just theorising. I've not really thought about it too much, but these cards, like you say, will easily run the lower resolutions, so maybe there's not much need for it below 4K.
 
OK, had a little look into it.
So it's not really 4k for the DLSS in Final Fantasy 15 - it's 1440p upscaled then supersampled. This spreads the load onto the Tensor cores and gives faster FPS.

https://www.youtube.com/watch?v=Jpd5j5W1NZw

Higher FPS, for similar-ish image quality, though the true 4k looks a bit better in some instances.
 
Yeah that's what DLSS is :). It reconstructs a high resolution image from a lower one.
Now why couldn't Custom PC have come up with a nice simple explanation like that in their RTX review, instead of rambling on about a 4x4 matrix of numbers, multiplied by another 4x4 matrix, added to another 4x4 matrix (I glazed over at that point)? Absolutely no mention of that.

Having googled it, I guess it's down to Nvidia being vague about what it does, and sites like yourselves and Tomshardware having to figure it out.

Interesting that image quality improves during a scene - good for people like myself who will hit the pause button and put the kettle on - which implies that the card is doing the learning, or at least some of it.
 
Quote from another forum, because I'm lazy, as to why it's only at 4k. Pretty succinct:

"Well the whole purpose of DLSS was to make 4k gaming more viable in demanding games so it doesn't really have much value at lower resolutions.
It could be made to work but then there would have to be new models generated. By default DLSS will render at something like 1800p and then upscale to 4k giving an image quality that should be better than 4K with standard temporal AA.
I imagine there might be value in a 1080p mode for low end cards that render at say 850p and upscale, but the low end cards would need the Tensor cores so it is not obvious if that would make sense vs simply offering more CUDA cores."
 
(TLDR: 1080p output would need a 720p input, 1440p would need 960p, both of these would create terrible images)
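
To put rough numbers on that TLDR (just my own back-of-the-envelope arithmetic, assuming the fixed 3/2-per-dimension factor explained below; Nvidia haven't published exact figures):

```python
# Rough sketch of the assumed 3/2-per-dimension scaling; the factor and the
# target resolutions come from this thread, not from any official Nvidia spec.
SCALE = 3 / 2

def render_resolution(output_height: int) -> float:
    """Vertical resolution you'd have to render at internally for a given output."""
    return output_height / SCALE

for target in (1080, 1440, 2160):
    print(f"{target}p output -> ~{render_resolution(target):.0f}p internal render")

# 1080p output -> ~720p internal render
# 1440p output -> ~960p internal render
# 2160p output -> ~1440p internal render
```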

Tensor cores are big mixed-precision (FP16/FP32) fused multiply-add "calculators", primarily optimised for 4x4 matrices (a very special and common type of matrix in graphics pipelines, because a 4x4 matrix working on homogeneous coordinates can express 3D rotations, scales and translations, and chains of them, as a single transform). Basically they're made mostly to perform one operation (they can do other related operations, but this is where their greatest speed-up is): they multiply two FP16 numbers, then add the resulting FP32 product to another FP32 number, in one step.
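
If it helps to see that one operation written out, here's a toy numpy version (purely an illustration of mixed-precision FMA on 4x4 matrices, not how Nvidia actually implement it in hardware):

```python
import numpy as np

# Toy emulation of a Tensor core's mixed-precision fused multiply-add:
# multiply two FP16 4x4 matrices and accumulate the result into an FP32 matrix.
A = np.random.rand(4, 4).astype(np.float16)   # FP16 operand
B = np.random.rand(4, 4).astype(np.float16)   # FP16 operand
C = np.random.rand(4, 4).astype(np.float32)   # FP32 accumulator

# D = A*B + C: the multiplies use FP16 inputs, the accumulation happens in FP32
# (casting to FP32 before the matmul emulates the FP32 accumulate).
D = A.astype(np.float32) @ B.astype(np.float32) + C
print(D.dtype, D.shape)   # float32 (4, 4)
```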

In theory, and presumably loosely similar to how Nvidia are using it, you can feed two rows/columns of pixels into the system and get a "middle" row as the output (it won't actually be split up into row/column steps like this, but it explains the geometry that underpins the 1.5x resolution scaling inherent to this technique).
Two original rows plus one new middle row gives (2+1)/2 = 3/2 the rows you started with.
And 1440 * 3/2 = 2160, i.e. cramming a new row/column between every original pair turns 1440p into 2160p.

Basically, DLSS is primarily an upscaling algorithm, likely with some TAA techniques mixed in too, and with this mode of operation the output resolution will always be 1.5x the input (per dimension). As you can work out, this means a 1440p output would need a 960p input, which would likely make it look worse than a native 1440p image for minimal benefit, and 1080p would need a 720p input. But whenever you have interpolation in this way, the ABSOLUTE MINIMUM without "skipping rows" is a 3/2 (1.5x) increase in resolution, regardless of the technique used in the filter.
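
A crude way to picture that row geometry (plain averaging here as a stand-in for whatever learned filter DLSS actually uses, and only done along rows; columns would get the same treatment for the full 1.5x per dimension):

```python
import numpy as np

# Toy sketch of the "two rows in, one middle row out" geometry described above.
def upscale_rows_3_over_2(img: np.ndarray) -> np.ndarray:
    """For every pair of rows, emit both plus an interpolated middle row (2 -> 3)."""
    rows = []
    for i in range(0, img.shape[0] - 1, 2):      # assumes an even row count
        middle = (img[i] + img[i + 1]) / 2       # the synthesised "middle" row
        rows.extend([img[i], middle, img[i + 1]])
    return np.stack(rows)

frame = np.random.rand(1440, 8)                  # stand-in for a 1440-row frame
print(upscale_rows_3_over_2(frame).shape[0])     # 2160 rows = 1440 * 3/2
```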
 