Nvidia finally shares RTX 2080 performance data

I'm all for independent reviews and suspicious of those from manufacturers.

DLSS is enabled on the driver side for all games, and not on the hardware or game-dev side, correct?
 
Funny how we have had all of these "leaks", yet not one "leaker" has decided to upload a Firestrike score, as we always see when a GPU is about to release.
 
I'm all for independent reviews and suspicious of those from manufacturers.

DLSS is enabled on the driver side for all games, and not on the hardware or game-dev side, correct?

It needs developer support. The DLSS tech is based on deep learning and therefore requires a lot of "ground truth" data to optimise the algorithm. I.e., the feature likely needs to be implemented on a game-by-game basis.

For the sake of speed, DLSS would, in theory, work best when designed with specific games in mind. An accurate one-size-fits-all approach to DLSS would likely be too computationally intensive to be worth it.
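
Roughly, the per-game training idea would look something like this (a toy sketch assuming PyTorch; the network, data shapes, and random tensors are made-up stand-ins, not Nvidia's actual pipeline):

import torch
import torch.nn as nn

# Toy upscaler: learns to map a low-res render to a 2x-upscaled frame.
class ToyUpscaler(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Upsample(scale_factor=2, mode="nearest"),
            nn.Conv2d(32, 3, kernel_size=3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

model = ToyUpscaler()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Stand-ins for per-game training pairs: low-res renders plus the
# supersampled "ground truth" frames the network learns to reproduce.
low_res = torch.rand(4, 3, 64, 64)          # frames rendered at reduced resolution
ground_truth = torch.rand(4, 3, 128, 128)   # supersampled reference frames

for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(low_res), ground_truth)  # penalise deviation from reference
    loss.backward()
    optimizer.step()

The point is that the ground-truth frames have to come from the game itself, which is why each title would need its own training pass.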
 
It needs developer support. The DLSS tech is based on deep learning and therefore requires a lot of "ground truth" data to optimise the algorithm. I.e., the feature likely needs to be implemented on a game-by-game basis.

For the sake of speed, DLSS would, in theory, work best when designed with specific games in mind. An accurate one-size-fits-all approach to DLSS would likely be too computationally intensive to be worth it.

Then, considering 99% of games are developed for consoles, which lack the hardware to support it, before being ported to PC, I don't really see this taking off much.

Therefore it's safe to assume none of the games shown here are suddenly coded for DLSS, and thus the slide shows assumed performance.
 
[image]

How many GigaRays do I need for these graphics?
 
It needs developer support. The DLSS tech is based on deep learning and therefore requires a lot of "ground truth" data to optimise the algorithm. I.e., the feature likely needs to be implemented on a game-by-game basis.

For the sake of speed, DLSS would, in theory, work best when designed with specific games in mind. An accurate one-size-fits-all approach to DLSS would likely be too computationally intensive to be worth it.

If Nvidia applied machine learning and updated it through the driver, they could do a one-size-fits-all approach.


As for the performance difference between them, this is just another example of manufacturers skewing results. A 1080 at 4K is a horrible option; it will get low FPS, so hitting a 50% improvement shouldn't be difficult.
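
To put hypothetical numbers on it (the 30 fps baseline below is an assumption for illustration, not a measurement):

gtx_1080_4k_fps = 30.0                       # assumed 4K baseline, not a measured value
rtx_2080_4k_fps = gtx_1080_4k_fps * 1.5      # the claimed 50% uplift

print(f"GTX 1080 at 4K: {gtx_1080_4k_fps:.0f} fps")
print(f"RTX 2080 at 4K: {rtx_2080_4k_fps:.0f} fps")  # 45 fps, still short of 60

A 50% jump off a weak baseline still leaves you short of a locked 60 fps, which is the point.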
 
Funny how we have had all of these "leaks", yet not one "leaker" has decided to upload a Firestrike score, as we always see when a GPU is about to release.

But then Nvidia's new cards would be compared to all the others doing normal workloads.

Worse still, the 2080 Ti won't be able to top the Firestrike tables. :D

If any do appear, don't take them too seriously unless it is something like Firestrike Ultra or Timespy Extreme, where the GPUs get a real workout.
 
Thinking about it, it's even less impressive.

Not only have we waited two years for new technology, but it's hardly faster.

They should be comparing this to a 1080 Ti. The price is the closest, and compared to that it's about 20% faster. That's nothing after two years. Terrible performance. Charging us way more for hardly any gain. Compared to the 1080, it's only 1.5x faster for over 1.7x the price. You're wasting money.
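
Quick back-of-the-envelope on those ratios (the price below is a placeholder; only the 1.5x and 1.7x figures come from the comparison above):

gtx_1080_price = 500.0                        # hypothetical price, for illustration only
rtx_2080_price = gtx_1080_price * 1.7         # ~1.7x the price, per the comparison above
perf_ratio = 1.5                              # ~1.5x the performance

value_ratio = perf_ratio / (rtx_2080_price / gtx_1080_price)
print(f"Performance per dollar vs the 1080: {value_ratio:.2f}x")  # ~0.88x, i.e. worse value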
 
Thinking about it, it's even less impressive.

Not only have we waited two years for new technology, but it's hardly faster.

They should be comparing this to a 1080 Ti. The price is the closest, and compared to that it's about 20% faster. That's nothing after two years. Terrible performance. Charging us way more for hardly any gain. Compared to the 1080, it's only 1.5x faster for over 1.7x the price. You're wasting money.

Since the:

2080 Ti replaces Titan Xp
2080 the 1080 Ti
2070 the 1080/1070 Ti

Yes, they should make those comparisons.

Turing could be a stopover on the way to 7nm (Ampere?); in other words, the delay does not necessarily mean more performance.
 
But then Nvidia's new cards would be compared to all the others doing normal workloads.

Worse still, the 2080 Ti won't be able to top the Firestrike tables. :D

If any do appear, don't take them too seriously unless it is something like Firestrike Ultra or Timespy Extreme, where the GPUs get a real workout.

I have this feeling it will not beat a Titan V.

Are you taking the plunge, Kaap? Or avoiding it, given you have the V?
 
Since the:

2080 Ti replaces Titan Xp
2080 the 1080 Ti
2070 the 1080/1070 Ti

Yes, they should make those comparisons.

Turing could be a stopover on the way to 7nm (Ampere?); in other words, the delay does not necessarily mean more performance.

No, they shouldn't. They should compare it to its price class, because that is the performance expectation of that class.
 
No, they shouldn't. They should compare it to its price class, because that is the performance expectation of that class.

Depends on the way one looks at it, I suppose. Ideally they should do both. Price classes aren't fixed, though; certainly not this round.

@Kaapstad, about the 12th memory spot: how different or similar is the PCB compared to a Quadro? I was thinking that may be it.
 