Nvidia reveals the specifications of their Geforce RTX 2080 Ti, RTX 2080 and RTX 2070

So the 2070 is hardly an upgrade from 1070 Ti spec wise if we ignore RTX? I don't think memory bandwidth has been much of a limiting factor on those cards.
 
So the 2070 is hardly an upgrade from 1070 Ti spec wise if we ignore RTX? I don't think memory bandwidth has been much of a limiting factor on those cards.

The performance from the architecture (excluding tensor cores) seems to be a much smaller jump than Maxwell to Pascal was. So I would guess the SM side of things alongside memory improvements probably makes it 10-15% faster. Then add in Tensor cores and it's far more potent. Just my guess anyway.
 
Somebody over at OCUK forums did this as a bit of maths yesterday.

TITAN V: GPU Cores 5120
1455 MHz (Turbo)= 14.9 TFLOP
1900MHz (overclock) = 19.5 TFLOP

RTX 2080 Ti : GPU Cores 4352
1545 MHz (Turbo)= 13.4 TFLOP
2000MHz (overclock ?) = 17.4 TFLOP

GTX1080 Ti : GPU Cores 3584
1582 MHz (Turbo) = 11.3 TFLOP
2000MHz (overclock) = 14.3 TFLOP

RTX 2080 : GPU Cores 2944
1710 MHz (Turbo)= 10.1 TFLOP
2000MHz (overclock ?) = 11.8 TFLOP

GTX1080 : GPU Cores 2560
1733 MHz (Turbo) = 8.9 TFLOP
2000MHz (overclock) = 10.2 TFLOP
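For reference, the figures above follow the standard peak-FP32 formula: CUDA cores × clock in MHz × 2 ops per fused multiply-add, divided by a million. A quick sketch to reproduce a couple of them:

```python
# Peak FP32 throughput: each CUDA core can retire one fused
# multiply-add (2 floating-point ops) per clock cycle.
def peak_tflops(cuda_cores: int, clock_mhz: int) -> float:
    """Return theoretical single-precision TFLOPS."""
    return cuda_cores * clock_mhz * 2 / 1_000_000

# Reproducing two of the figures above:
print(round(peak_tflops(5120, 1455), 1))  # Titan V at turbo clock
print(round(peak_tflops(3584, 1582), 1))  # GTX 1080 Ti at turbo clock
```

Bear in mind this is a theoretical ceiling only; it says nothing about per-core efficiency or real game performance.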

Which may be why Nvidia are hiding the actual performance. The "Up to 6x faster" claim is obviously misleading, as that only applies to ray tracing, which the 10 series are going to be potatoes at because they don't have RT hardware on them.

We'll see I guess.
 
As I have said in some of my posts, for non-RT tasks alone, upgrading to Turing is not viable in any universe. It is the new stuff that is costing the money.

I would expect a similar bump in non-RT tasks as we saw with the previous generation. The above calculation (cheers to the guy who did it) assumes that CUDA cores have the same per-core performance. I think the new chips will have better optimisation, and better performance "per core", than the previous generation. Similar to what we see in the CPU world.

Shout out to AlienALX, I feel your pain and understand your point. If I ever come to the UK I will buy you a beer.
 
The first thing that does need to be said... we were obviously playing early versions of the RTX builds of both games, and with relatively early drivers for the RTX 2080 Ti graphics card too.

Tomb Raider

We weren't able to see what settings the game was running at.

With the FPS counter on in GFE we could see the game batting between 33fps and 48fps [at 1080p] as standard throughout our playthrough and that highlights just how intensive real-time ray tracing can be on the new GeForce hardware.

While the shadows in my play-time did look pretty good, in that brightly lit instance it’s hard to see where they look that much better than the traditional way that shadows are faked in-game. And to enable the ray traced shadows you’re obviously having to pay a huge performance penalty for the privilege.
I'm not even 100% convinced Shadow of the Tomb Raider was running at max settings at 1080p.

Battlefield 5

Playing the new Rotterdam map in Battlefield 5, however, was more convincing of how good real-time ray tracing can make a game look.
And everything has reflections. From the bonnets of cars reflecting the muzzle flash of your rifle, to the puddles on the floor, and the about-to-be-blown-out windows of a Dutch tram reflecting gouts of flame from a red-hot tank. The wooden stock of your gun has low-level, ray traced reflections on it, hell, even the watery eyes of your soldier seem to.

But of course there is still a hefty performance hit to the game... while we couldn't bring the fps counter up in the show demo version, we'd bet it wasn't hitting 60fps either. And in a competitive online shooter visual fidelity is arguably far less important than getting a high frame rate. And running at a higher resolution, without the ray traced reflections, would likely be preferable too as you could actually see more detail at range for those precision shots from downtown.

https://www.pcgamesn.com/nvidia-rtx-2080-ti-hands-on

I'm sure Jensen just forgot to mention all that!
 
D'you know I was thinking about this last night whilst watching TV in bed, and I wonder just how realistic I want crushing a skull in say Doom to look.

Surely there is going to come a point where all of this violence and gore looks completely real?

https://www.pcgamesn.com/nvidia-rtx-2080-ti-hands-on

I'm sure Jensen just forgot to mention all that!

He's a salesman dude. He turns up, sells something (no matter how true or untrue it may be) and then buggers off having set his trap.

Also. I hope when we see reviews of Turing they also show the Titan V. I am curious to see just how much better it may or may not be.
 
D'you know I was thinking about this last night whilst watching TV in bed, and I wonder just how realistic I want crushing a skull in say Doom to look.

Surely there is going to come a point where all of this violence and gore looks completely real?

Yep, right before the police come and arrest you for wiping out a neighborhood :p But yeah, there may come a time, though it can't get too realistic because that would tick some organizations off.


He's a salesman dude. He turns up, sells something (no matter how true or untrue it may be) and then buggers off having set his trap.

Also. I hope when we see reviews of Turing they also show the Titan V. I am curious to see just how much better it may or may not be.


Ha ha, I know! Each time he uttered "I love rays" I wanted to puke. Trying to put that into the audience's heads, ugh.
 
Just watched Adored's latest video. I would recommend watching it, it's very informative.

I don't think these cards are going to perform that much better in the games we have now than the current cards do, tbh. It's all 100% game dependent.
 
Just watched Adored's latest video. I would recommend watching it, it's very informative.

I don't think these cards are going to perform that much better in the games we have now than the current cards do, tbh. It's all 100% game dependent.


We want 7nm :D


I'll go have a look at that video.
 
We want 7nm :D


I'll go have a look at that video.

I am beginning to realise the performance margins could well be pretty small. Why else would Nvidia put up pre orders without providing any performance metrics whatsoever? I don't recall them ever doing that before..

However, the more I think about this, the more I end up coming back to the Titan V. It was, at very best, about 25% faster in regular stuff than the 1080 Ti. It has more CUDA cores etc. than any Turing GPU..

And that just won't stop resonating in my head. How the jeff have Nvidia managed to better that, in less than a year, with a smaller die and fewer cores?

Yeah, something just ain't adding up to me.

It's all about RT though, isn't it? I mean, the other day I said that what we have now could mash any game into submission even at 4k, so all we would really want Turing for is RT. A couple of demos until at least next year.

Mind you it doesn't matter what I say really. Nvidia will mop up sales and the world will keep turning.. I just wonder how many of those pre orders will end up being returned? I mean, right now the poll over at OCUK looks like this.

[image: OCUK pre-order poll results]


Which to be fair is pretty poor. However, right now we don't even know the performance. How many of those are going to think they are worth what they paid once they see the actual performance?

It could even be lower. At least now it's all excitement and hope....
 
I am beginning to realise the performance margins could well be pretty small. Why else would Nvidia put up pre orders without providing any performance metrics whatsoever? I don't recall them ever doing that before..

However, the more I think about this, the more I end up coming back to the Titan V. It was, at very best, about 25% faster in regular stuff than the 1080 Ti. It has more CUDA cores etc. than any Turing GPU..

And that just won't stop resonating in my head. How the jeff have Nvidia managed to better that, in less than a year, with a smaller die and fewer cores?

Yeah, something just ain't adding up to me.

It's all about RT though, isn't it? I mean, the other day I said that what we have now could mash any game into submission even at 4k, so all we would really want Turing for is RT. A couple of demos until at least next year.

Mind you it doesn't matter what I say really. Nvidia will mop up sales and the world will keep turning.. I just wonder how many of those pre orders will end up being returned? I mean, right now the poll over at OCUK looks like this.

Which to be fair is pretty poor. However, right now we don't even know the performance. How many of those are going to think they are worth what they paid once they see the actual performance?

It could even be lower. At least now it's all excitement and hope....


A few times a day I briefly consider picking up a 1080 Ti. Prices haven't fallen a single Euro though so there's little incentive.
 
A few times a day I briefly consider picking up a 1080 Ti. Prices haven't fallen a single Euro though so there's little incentive.

Yeah, that sucks just as much as Turing prices tbh :(

I tell you what though I am watching GPUs go cheap on OCUK MM. Just saw three 1070Tis go for £270 each.

There will be an influx of cheap ex-mining cards now, and so long as they have warranty you're sorted really.
 
A few times a day I briefly consider picking up a 1080 Ti. Prices haven't fallen a single Euro though so there's little incentive.

I'm waiting till around $300 before I buy, unless AMD releases something or an RTX card at $300 is much faster than a 1080. Though I bet the 2070 is a 1080 Ti equal, so I doubt it would be worth it for a sidegrade.
 