Expect a 35-45% performance boost for Turing over its Pascal-based counterparts.

Read more about Nvidia's Tom Peterson discussing Turing.

I'll believe it when I see it
Considering the amount of waffle they came out with the other day, saying "just buy it", how can anyone take these guys seriously?
I assume, then, that those games with DLSS stats already have the trained algorithms and will support it at launch? Meaning in ARK I would get more than double the FPS I would on a 1080?
I don't like this DLSS; it seems like another money grab by Nvidia.
If these numbers were an accurate representation of what could be expected, 35-45%, I have no doubt Nvidia would have come out of the gate touting that boost. I too think those increases are what they are seeing using new tech like DLSS. If that's the case, it's very shady in my opinion to represent it that way, because you aren't comparing apples to apples. All settings need to be the same across any given title to get a true idea of what the performance increase actually is.
TBH, it sounds like a more accurate take on the checkerboarding technique that is used on consoles. If the visual downsides are not noticeable in motion it is a huge win, but it is far from being a one-size-fits-all solution.
The fact that Nvidia says they need to crunch numbers on a supercomputer shows how difficult it is to train their AI. Not exactly a technique that I see coming to every game. Still a huge design win for Nvidia.
Why render to a full 4K if you can't notice the difference between it and Nvidia's Faux-K DLSS tech? The only problem with DLSS is that it is Nvidia only. We need something that can work on all GPUs.
Is it me or is this being heavily twisted?
Do I have it correct that this is an anti-aliasing tech that they are comparing to an old one? I am sure I watched a vid on it the other day. If it is what I think it is, they are not comparing it to their own FXAA, which is much faster than other methods.
That wasn't Tom's Hardware, was it? The podcast with the interview was from Hot Hardware.
I may be confused here, but those figures came from an anus. "The Founders Edition clocks to 2.1GHz." Well, so does my Titan XP, beyotch.