Why would you need AA at 4k resolution?
I suppose if you're going through a big screen, but my 4k monitor's a 27 inch screen, absolutely no need for AA at that pixel density.
Because resolution does not fix shading and lighting aliasing.
Ok, there is a need then
If DLSS runs faster and looks better than TAA, then I don't see why they don't enable it at all resolutions.
That is the question you should have asked from the start. I am curious too.
It might be a case of DLSS looking better at higher resolutions but becoming detrimental below 4K, perhaps.
Well. You're more of an expert in this area than I am
Maybe there is no FPS gain at lower resolutions due to bottlenecking? So I guess showboating DLSS on and off at 1440p, where the FPS remains the same, is not a good marketing ploy. From what I gather, the 2080 Ti is quite taxing on the CPU as opposed to the 1080 Ti and older.
Ok, had a little look into it.
So it's not really native 4K with DLSS in Final Fantasy 15 - it's 1440p upscaled then supersampled. This offloads the work onto the Tensor cores and gives higher FPS.
https://www.youtube.com/watch?v=Jpd5j5W1NZw
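For what it's worth, the resolution arithmetic behind that is simple: 1440p to 4K is a uniform 1.5x scale on each axis, so the card shades well under half the pixels of native 4K. A rough sketch below uses naive nearest-neighbour upscaling purely to show the pixel counts involved - DLSS itself uses a trained network, not this:

```python
# Illustration of the 1440p -> 4K upscale mentioned above.
# DLSS uses a neural network; this naive nearest-neighbour version
# just shows the scale factor and pixel counts.

SRC_W, SRC_H = 2560, 1440   # render resolution (1440p)
DST_W, DST_H = 3840, 2160   # output resolution (4K)

def upscale_nearest(src, src_w, src_h, dst_w, dst_h):
    """Nearest-neighbour upscale of a row-major pixel list."""
    out = []
    for y in range(dst_h):
        sy = y * src_h // dst_h           # nearest source row
        for x in range(dst_w):
            sx = x * src_w // dst_w       # nearest source column
            out.append(src[sy * src_w + sx])
    return out

# The GPU shades 2560*1440 = 3,686,400 pixels instead of
# 3840*2160 = 8,294,400 - roughly 44% of the native-4K shading work.
```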
Now why couldn't Custom PC have come up with a nice simple explanation like that in their RTX review, instead of rambling on about a 4x4 matrix of numbers multiplied by another 4x4 matrix and added to another 4x4 matrix (I glazed over at that point)? Absolutely no mention of that.
Yeah, that's what DLSS is. It reconstructs a high-resolution image from a lower-resolution one.
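To be fair to them, that 4x4 matrix bit describes the one hardware operation a Tensor core performs: a fused multiply-add, D = A x B + C, on small matrix tiles. A plain-Python sketch (illustration only, nothing like the actual hardware path):

```python
# Sketch of the Tensor core primitive described in the review:
# D = A @ B + C on 4x4 matrices, done as one fused operation.

def matmul_add_4x4(A, B, C):
    """Return D = A @ B + C for 4x4 matrices given as nested lists."""
    D = [[0.0] * 4 for _ in range(4)]
    for i in range(4):
        for j in range(4):
            acc = C[i][j]                 # start from the accumulator matrix
            for k in range(4):
                acc += A[i][k] * B[k][j]  # multiply-accumulate
            D[i][j] = acc
    return D

# With A = identity, the result is simply B + C.
I4 = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]
B = [[float(i * 4 + j) for j in range(4)] for i in range(4)]
C = [[1.0] * 4 for _ in range(4)]
D = matmul_add_4x4(I4, B, C)
```

Neural-network inference is almost entirely these multiply-accumulates, which is why the cores are built around that one operation.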
Having googled it, I guess it's down to Nvidia being vague about what it does, and sites like yours and Tom's Hardware having to figure it out.