Nvidia's Tom Peterson answers GeForce RTX/Turing questions

Let's be honest, the RTX 2080 offering a 35-45% boost over the GTX 1080 isn't that impressive, especially given the price increase. Feels very GTX 1080 Ti territory.

TBH, I think Nvidia needs to get DLSS used by as many devs as they can to make the performance gains feel worthwhile, though this assumes that DLSS on vs. off delivers an image quality difference that is almost unnoticeable in practice.
 
Is it me or is this being heavily twisted?

Do I have it correct that this is an anti-aliasing tech that they are comparing to an old one? I'm sure I watched a vid on it the other day. If it is what I think it is, they are not comparing it to their own FXAA, which is much faster than other methods.

I may be confused here, but those figures sound like they were pulled out of someone's backside. "The Founders Edition clocks to 2.1GHz"? Well, so does my Titan XP, beyotch.
 
I assume, then, that the games with DLSS stats already have the algorithms trained and it will be available at launch? Meaning in ARK I would get more than double the FPS that I would on a 1080?

I don't like this DLSS; it seems like another money grab by Nvidia.
 
TBH, it sounds like a more accurate take on the checkerboarding technique that is used on consoles. If the visual downsides are not noticeable in motion it is a huge win, but it is far from being a one-size-fits-all solution.

The fact that Nvidia says they need to crunch numbers on a supercomputer shows how difficult it is to train their AI. Not exactly a technique that I see coming to every game. Still a huge design win for Nvidia.

Why render to a full 4K if you can't notice the difference between it and Nvidia's Faux-K DLSS tech? The only problem with DLSS is that it is Nvidia only. We need something that can work on all GPUs.
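
To put rough numbers on why rendering below native resolution and reconstructing is so attractive, here's a quick back-of-the-envelope sketch in Python. The internal resolution, the pixel counts and the nearest-neighbour upscale are all my own toy assumptions; this is nothing like Nvidia's actual network or the console checkerboard pattern, it just shows where the shading savings come from.

```
# Toy illustration: shading cost of native 4K vs rendering at a lower
# internal resolution and upscaling. Purely illustrative numbers.

def shaded_pixels(width, height):
    """Pixels the GPU actually has to shade per frame."""
    return width * height

native = shaded_pixels(3840, 2160)       # native 4K
internal = shaded_pixels(2560, 1440)     # assumed internal render resolution

print(f"native 4K pixels per frame: {native:,}")
print(f"1440p internal pixels:      {internal:,}")
print(f"shading work saved:         {1 - internal / native:.0%}")

def upscale_nearest(img, scale):
    """Naive nearest-neighbour upscale of a 2D list of pixel values.
    A crude stand-in for the 'fill in the missing detail' step that
    DLSS or checkerboarding does far more cleverly."""
    return [[row[x // scale] for x in range(len(row) * scale)]
            for row in img for _ in range(scale)]

small = [[0, 1], [2, 3]]
print(upscale_nearest(small, 2))
# [[0, 0, 1, 1], [0, 0, 1, 1], [2, 2, 3, 3], [2, 2, 3, 3]]
```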
 
If these numbers were an accurate representation of what could be expected, 35-45%, I have no doubt Nvidia would have come out of the gate touting that boost. I too think that those increases are what they are seeing using new tech like DLSS. If that's the case, it's very shady in my opinion to represent it that way, because you aren't comparing apples to apples. All settings need to be the same across any given title to get a true idea of what the performance increase actually is.
 
It must be remembered that the Nvidia conference started much later than expected, so perhaps they didn't have time to go into great detail. Several RTX game demos needed to be skipped to make up the time. Perhaps planned comments regarding performance were also cut.

He explicitly said that the 35-45% performance boost was for existing games, without RTX support. The problem is that the RTX 2080 has GTX 1080 Ti pricing... so performance comparisons with the 1080 are somewhat useless from a value-for-money perspective.
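
To put that value point in numbers, here's a rough perf-per-dollar sketch. The prices are approximate launch MSRPs from memory (not from the interview), the 1080 Ti uplift is a ballpark guess, and the RTX 2080 numbers just take the quoted 35-45% range at face value, so treat it as illustrative only.

```
# Rough value-for-money sketch. Prices and uplifts are assumptions for
# illustration, not benchmark results.

cards = {
    "GTX 1080":        (1.00, 499),   # baseline perf, assumed street price
    "GTX 1080 Ti":     (1.30, 699),   # ~30% over a 1080, assumed
    "RTX 2080 (low)":  (1.35, 799),   # 35% uplift, FE pricing
    "RTX 2080 (high)": (1.45, 799),   # 45% uplift, FE pricing
}

for name, (perf, price) in cards.items():
    print(f"{name:16s} perf per $1000 = {perf / price * 1000:.2f}")
```

Even taking the optimistic end of the range, the 2080 at Founders Edition pricing comes out behind the 1080 Ti on that metric, which is the whole complaint.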
 
The nature of machine learning requires many iterations; they probably use their supercomputer because it reduces the training time considerably and lets them run more intense analysis. By the sounds of it, developers don't need to do all that much to get it to work; they just need to assist Nvidia by providing builds and some light support/insight?
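
A crude way to see why the big hardware matters: training is just the same update loop run an enormous number of times, so wall-clock time is roughly iterations × time-per-iteration ÷ how much you can parallelise. The iteration count, per-step cost and scaling efficiency below are made-up round numbers, purely to show the shape of the maths.

```
# Toy estimate of training wall-clock time. All numbers are invented
# to illustrate scaling, nothing here is measured or from Nvidia.

iterations = 1_000_000_000        # "many iterations" of the training loop
secs_per_iter_one_gpu = 0.01      # assumed cost of one step on one GPU

def wall_clock_days(workers, efficiency=0.8):
    """Rough time-to-train assuming near-linear scaling, with an assumed
    efficiency factor lost to communication overhead."""
    secs = iterations * secs_per_iter_one_gpu / (workers * efficiency)
    return secs / 86_400

for workers in (1, 8, 512):
    print(f"{workers:4d} worker(s): ~{wall_clock_days(workers):,.1f} days")
```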

I'm not sure what you're referring to here? DLSS is a type of super sampling.
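
For anyone wondering what "super sampling" means in the traditional sense: render more samples than you have output pixels and average them down. DLSS flips that around by rendering fewer samples and reconstructing the rest, but the toy sketch below only shows the classic averaging step, nothing DLSS-specific.

```
# Classic super-sampling anti-aliasing in miniature: shade a 2x-resolution
# image, then average each 2x2 block down to one output pixel.

def downsample_2x(hi_res):
    """Average non-overlapping 2x2 blocks of a 2D list of brightness values."""
    out = []
    for y in range(0, len(hi_res), 2):
        row = []
        for x in range(0, len(hi_res[y]), 2):
            block = (hi_res[y][x] + hi_res[y][x + 1] +
                     hi_res[y + 1][x] + hi_res[y + 1][x + 1])
            row.append(block / 4)
        out.append(row)
    return out

# A hard black/white edge rendered at 2x resolution...
hi = [[0, 0, 1, 1],
      [0, 0, 1, 1],
      [0, 1, 1, 1],
      [0, 1, 1, 1]]

print(downsample_2x(hi))   # [[0.0, 1.0], [0.5, 1.0]]  (softened edge)
```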
 
To be fair, all modern AI and ML models are trained on supercomputers because it's much more efficient to run through billions of iterations in seconds rather than a few thousand on basic computers. It has nothing to do with how difficult it is to train; it's just necessary. The more it's done, the better it gets, which is also a benefit.
 
Good interview with good info, more down to earth than the marketing-type keynotes, and Peterson even says they could've done it better and compared it to the old gens. I always look forward to these deep dives with him. Is there some bias going on? Probably. Still, it's the cleanest I'm going to get from Nvidia.

DLSS is the thing that excites me most, as previously mentioned, and I hope most devs opt for it. Only problem is that I have no card to run it on :p
 