Nvidia Launches GeForce 418.91 Drivers for Metro Exodus and Battlefield V with DLSS

Blurry As...

Well I certainly didn't expect DLSS to look much blurrier than TAA.... Boy was I wrong, DLSS looks much worse. Failed technology (at this point at least).
 
Well I certainly didn't expect DLSS to look much blurrier than TAA.... Boy was I wrong, DLSS looks much worse. Failed technology (at this point at least).

I do wonder though if it's just Metro where we see it, or has it been confirmed as the same problem in BFV?

edit* Johnny Guru confirmed in his updated review of DLSS in BFV that it is blurry there too.
 
Such a shame as it was one of the reasons I was most excited about owning a 2080 card.

Luckily I upgraded for the extra grunt too, so not all is lost, but it's still a bit of a disappointment given how good the FF XV benchmark seemed to look.
 
DLSS is no joke. I'm running BFV at 1440 on Ultra, DXR on low. Beforehand I'd get about 50-60fps, now I'm 90+

Doesn't look noticeably different when playing.

It's the very first public implementation in a game too so it'll get better as driver support improves I think.
 
If it's anything like BF1, I guess your screen is so full of digital dirt, blur effects, distortions and stuff half the time on default settings that it doesn't make a massive difference. (I do really enjoy the visual overload of BF1 though, much better than the vaseline-like suppression blur of BF3/4.)
 
Blur makes me sick. Like, literally sick. I have to disable it as much as I can. Motion blur in games makes me go boss-eyed if I look at it.

I did see a vid on YT the other day calling out DLSS but typically I can't find the link now.
 
DLSS is no joke. I'm running BFV at 1440 on Ultra, DXR on low. Beforehand I'd get about 50-60fps, now I'm 90+

Doesn't look noticeably different when playing.

It's the very first public implementation in a game too so it'll get better as driver support improves I think.

Not sure if we have seen the same resolutions then.

I saw DLSS first-hand at 4K in BFV and the blurry effect is terrible. Not sure how you don't see it; it's literally night and day when I toggle it off and on. I do hope driver support improves this though. It is the first iteration, as you say, so let's hope.
 
Not sure if we have seen the same resolutions then.

I saw DLSS first-hand at 4K in BFV and the blurry effect is terrible. Not sure how you don't see it; it's literally night and day when I toggle it off and on. I do hope driver support improves this though. It is the first iteration, as you say, so let's hope.

From what I understand about DLSS so far (and no, I have not read every tech doc word for word), it cuts image quality where it's not "needed" in order to increase the FPS.

This has happened before. I can't remember who was guilty of it (I'm thinking AMD, around 2005?), but it made a notable difference in image quality. It did, however, increase FPS, and at least DLSS is optional, right?

So I guess it depends on your needs. It sounds like something that would benefit multiplayer games more, but I would gladly take a cut in FPS for better image quality. I mean, part of why I loved Last Light so much was that it was so damn realistic and scary.

I've also read that even if you disable all anti-aliasing, DLSS will still be faster. I would imagine that's again because it cuts image quality where "needed" to deliver higher FPS.
 
I think the simplest explanation of DLSS you can get is that it runs at a 33% lower resolution and then puts the frame through a special custom filter that upscales and anti-aliases simultaneously. AFAIK there's nothing selecting which parts of the screen render at what resolution, though presumably they focus training the algorithm on improving the more common or higher-priority aspects of the game, at least initially, and I guess the inferencing could be prioritised to certain areas of the screen too if the tensor cores are being pushed.
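To make that concrete, here's a rough Python sketch of the idea as I understand it. It's purely illustrative: the function names (render_frame, nn_upscale_and_aa) are made up, and the "filter" is just a nearest-neighbour resize standing in for the real neural network.

```python
import numpy as np

# Toy sketch of the pipeline described above - NOT Nvidia's actual code.
# render_frame and nn_upscale_and_aa are hypothetical stand-ins.

TARGET_W, TARGET_H = 2560, 1440   # the output resolution the player sees
SCALE = 0.67                      # roughly "33% lower" internal resolution

def render_frame(width, height):
    # Stand-in for the game's renderer: just produces a random RGB frame.
    return np.random.rand(height, width, 3).astype(np.float32)

def nn_upscale_and_aa(frame, out_w, out_h):
    # Stand-in for the learned filter. Here it's a nearest-neighbour resize;
    # the real thing is a neural network on the tensor cores that upscales
    # and anti-aliases in a single pass.
    in_h, in_w, _ = frame.shape
    rows = np.arange(out_h) * in_h // out_h
    cols = np.arange(out_w) * in_w // out_w
    return frame[rows][:, cols]

def render_with_dlss():
    internal_w = int(TARGET_W * SCALE)   # 1715 at this scale
    internal_h = int(TARGET_H * SCALE)   # 964
    low_res = render_frame(internal_w, internal_h)          # cheap render
    return nn_upscale_and_aa(low_res, TARGET_W, TARGET_H)   # filter back up

print(render_with_dlss().shape)   # (1440, 2560, 3)
```

Whether the real network weights some screen regions more than others is my guess, as I said.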
 
Did some testing with BF V at 1440p...

I'm getting 100-120 fps with my settings, which is with DXR and DLSS off.

With DXR that drops to 40-55 fps; with DLSS and DXR (DLSS can't be used without DXR) I'm getting 60-80 fps.

But the visual impact is huge - it looks much closer to just playing at 1080p than to native 1440p. It looks outright horrible compared to, for instance, 3DMark Port Royal, which I thought was decent enough - though I find 3DMark graphics a bit blurry even without DLSS.

It's pretty much useless for a 2060, 1440p and BF V. The visual impact of DLSS is far larger than the benefit of DXR, and neither case is what I consider playable for a first-person shooter (>100 fps). I understand that for now this is my subjective opinion, but I might even make a video out of it once I have time.
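Just to put rough numbers on that trade-off, here's some quick back-of-the-envelope arithmetic on the midpoints of the ranges above (Python, purely illustrative):

```python
# Midpoints of the fps ranges above (BF V, RTX 2060, 1440p, my settings).
baseline = (100 + 120) / 2   # DXR off, DLSS off
dxr_only = (40 + 55) / 2     # DXR on, DLSS off
dxr_dlss = (60 + 80) / 2     # DXR on, DLSS on

print(f"DXR alone costs about {100 * (1 - dxr_only / baseline):.0f}% of the fps")
print(f"DLSS claws back about {100 * (dxr_dlss / dxr_only - 1):.0f}% on top of DXR-only")
print(f"but that's still about {100 * (1 - dxr_dlss / baseline):.0f}% below the non-DXR baseline")
```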


Edit: These screenshots from TechPowerUp! forum user MikjoA illustrate my issues perfectly: https://www.techpowerup.com/forums/threads/nvidia-dlss-test-in-battlefield-v.252545/#post-3994023
 
I think the simplest explanation of DLSS you can get is that it runs at a 33% lower resolution and then puts the frame through a special custom filter that upscales and anti-aliases simultaneously. AFAIK there's nothing selecting which parts of the screen render at what resolution, though presumably they focus training the algorithm on improving the more common or higher-priority aspects of the game, at least initially, and I guess the inferencing could be prioritised to certain areas of the screen too if the tensor cores are being pushed.

Incorrect.
https://www.nvidia.com/en-us/geforce/news/nvidia-dlss-your-questions-answered/

Everybody needs to read this blog.
 
Did you read the blog? It doesn't refute what I said at all. I mean, that blog is just vague marketing stuff that doesn't really state anything technical. Here are some more links:
https://www.nvidia.com/en-us/geforc...-new-technologies-in-rtx-graphics-cards/#dlss
("we leverage Deep Learning Super-Sampling’s vastly-superior 64xSS-esque quality, and our high-quality filters, to reduce the game’s internal rendering resolution. This greatly accelerates performance, without a noticeably negative impact on image quality,") or the Turing deep dive:
https://devblogs.nvidia.com/nvidia-turing-architecture-in-depth/

It's just a filter (based on a neural network inference algorithm) that upscales and anti-aliases a lower-resolution image. As stated in your link and my original post, pretty much all of the customisation or optimisation occurs at the training stage (i.e. on the supercomputer rather than as part of the client algorithm).
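To illustrate what I mean by the training/inference split, here's a toy Python sketch. The "model" is just a gain-and-bias fit standing in for the real neural network, the frame data is fake, and none of this is Nvidia's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_offline(naive_upscales, references):
    # Offline, on the training side: fit parameters that push a naively
    # upscaled frame towards a high-quality reference frame. Here that's a
    # crude gain + bias fit; in reality it's a neural network trained
    # against heavily supersampled ground-truth images.
    x = np.stack(naive_upscales).ravel()
    y = np.stack(references).ravel()
    gain = float((x * y).sum() / (x * x).sum())
    bias = float((y - gain * x).mean())
    return {"gain": gain, "bias": bias}   # shipped to the client via the driver

def apply_on_client(params, naive_upscale):
    # On the player's GPU: no per-game tuning happens here, just a fixed
    # learned transform applied to each frame.
    return params["gain"] * naive_upscale + params["bias"]

# Fake (upscaled low-res, reference) frame pairs standing in for training data.
refs = [rng.random((90, 160, 3), dtype=np.float32) for _ in range(4)]
ups  = [0.8 * r + 0.05 for r in refs]   # pretend the naive upscale is dim and washed out

model = train_offline(ups, refs)
print(model)
print(apply_on_client(model, ups[0]).shape)   # (90, 160, 3)
```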
 
Only thing that interested me was the comment about the blurry DLSS.

After all, most gamers don't care how the tech is done, they just care about the outcome. The outcome atm is poor quality, so let's hope they do improve this and get clearer-looking AA.

As a person studying computer science, I'm very interested in how it's done :D
 