Alleged RTX 2080 3DMark Time Spy result leaks

The 2080 is also physically a larger chip than the 1080 Ti: 529 mm² versus 471 mm². The RTX 2080 is NOT a midrange chip, the way I see it. It is a large die, it has many of the advanced features of the flagship GPUs, it costs more than or around the same as the previous flagship, and it has a high TDP.

The way I see it, Nvidia don't intend to reduce prices. The 'only' thing they've done is swap the 1080 Ti for the RTX 2080 and add Tensor cores and RT abilities. The clock speeds are similar, the prices are the same, the die size is similar, and the TDP isn't far off. In other words, Pascal performance per dollar is going to stick around. It's not like previous architectures, where you got more performance for less money (the 970 was as fast as a 780 Ti but cost less, the 770 was the same chip as the 680 but cheaper, etc.). What Nvidia have done now is keep the same performance per dollar, shifting everything up a notch, and add features you won't benefit from for a few years. The 2080 quite literally replaces the 1080 Ti. Performance per dollar doesn't change. It's almost like Turing doesn't exist. It's not a new cake that's better than the previous one; they've just added another layer of sugary flavour to the cake that was already there.
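To put that performance-per-dollar point in concrete terms, here's a rough sketch in Python. The launch MSRPs are from memory and the performance figures are simplifying assumptions from the post itself (970 ≈ 780 Ti, 2080 ≈ 1080 Ti); this is an illustration of the argument, not benchmark data:

```python
# Illustrative sketch of the performance-per-dollar argument above.
# Performance figures are the post's simplifying assumptions, not
# measured data; prices are launch MSRPs from memory.

def perf_per_dollar(relative_perf: float, price: float) -> float:
    """Relative performance delivered per dollar spent."""
    return relative_perf / price

# Index each old flagship's performance at 1.0 for its era.
gtx_780ti  = perf_per_dollar(1.0, 699)  # 780 Ti launch MSRP
gtx_970    = perf_per_dollar(1.0, 329)  # ~same perf, far cheaper
gtx_1080ti = perf_per_dollar(1.0, 699)  # 1080 Ti launch MSRP
rtx_2080   = perf_per_dollar(1.0, 799)  # assumed ~1080 Ti perf, FE price

maxwell_gain = gtx_970 / gtx_780ti    # ~2.1x: a real value jump
turing_gain  = rtx_2080 / gtx_1080ti  # ~0.9x: value actually drops

print(f"Maxwell value gain: {maxwell_gain:.2f}x")
print(f"Turing value gain:  {turing_gain:.2f}x")
```

Under those assumptions, Maxwell roughly doubled performance per dollar at the same performance tier, while Turing leaves it flat or slightly worse, which is the whole complaint.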

Someone over at the AnandTech forum also made a valid point about production cost: Turing cannot be cheap to produce.
 
Looks a bit suspect. My 1080 Ti score vs the "2080":

[screenshot: GuvQo1Q.png (3DMark score comparison)]
 
The claim is that they have an overclocked 2080 vs a stock 1080 Ti. I guess you OC yours to kingdom come. But yes, I really don't think the leak is close to what the real result will be.

Something like this would make Nvidia a laughing stock right now, considering the price they want to charge.
 
Not really. I overclocked it to 2 GHz on the core and 1500 on the memory. Quite an unremarkable OC, but the fans were at 100%.
 
Looking at that, your CPU score is almost double, dude. I would imagine that means they were using a crappy/stock CPU, and that will affect the graphics score, IMO.

What exact test was that?
 
Kinda makes sense, and could also be why we see a 2080 Ti at launch instead of months down the line like usual.

They don't have a Titan to release at launch, with the V already sent out previously.
 
Has anyone watched Jay's video on the prices and stuff? I know many have had criticisms of him in the past, but in general I find he's 'with it', no more or less than anyone else. People accuse him of being a shill, but that's absurd to me. His new video, though... I'm kind of confused, almost dumbfounded. It was fine for the first eight or so minutes, but then he said things that just don't make any sense.

Yeah, I bet Turing is really expensive to produce, and I imagine it took a lot to design as well. 'Member when everyone laughed at Nvidia's claim that Pascal cost "billions" to design, because of how similar to Maxwell it seemed (clock for clock it performed the same, just with more advanced compression techniques, driver optimisations, SMP, etc.)? I don't think that's the case now; I don't think people doubt how expensive and adventurous Turing seems to be.
 
From what I've seen so far, specs-wise the 2080 Ti is just a cut-down Titan V.

Yup, called that one last week. I reckon it should perform around the same as the Titan V, so 15-25% faster than the Titan Xp. It will also need serious cooling to get the most out of it.

Having spent the best part of 35 years working on computers and being into this stuff, I can tell you with some certainty that when it comes to deep-down tech, Jay really has a lot to learn. I have seen several videos now where he did things and clearly did not know what he was doing, like the time he blew up an AMD CPU. I do recall Tom blowing up a SATA controller(?) on an X58 board, but to be fair he was under nitrogen and going for a WR, so it is understandable that there would be casualties. Jay, though? He just set the volts too high (he should have known the figures he needed to be at), rebooted the rig and popped the CPU. On a standard board, CPU and cooler.

It irks me when people become celebrities. The whole YT celeb phenomenon really bothers me, mostly because these are the people many go to in order to learn, and if they don't know what they're doing, pretty haircuts and subscriber counts make f**k all difference. Now, yeah, people will accuse me of being a know-it-all, but that is a million miles from the truth. However, I would not, ever, become a YT celeb. Just not me.

If these guys were doing it for the love of the subject, or out of passion, then fair enough. But it is a *paid job*, and to that end they are being rewarded financially and have a lot at stake.

I just feel that as soon as that happens, all brutal honesty vanishes out of the window. I also DO NOT like how they prance around the streets, walk up to people and say, "Do you know who I am?" Seriously, you're just some wayne from the internet; get over yourself.

They all kinda consider themselves to be l33t (apart from Tom), but they all have an awful lot to learn. At least Linus is very knowledgeable about servers and workstations as well as the desktop basics. However, waltzing up to people expecting them to know who he is is stupid. That really puts me off, so I don't sub to any of them (I do sub to Tom, though).
 
That video was awful
 
Indeed, I agree. Jay went full retard. I'm not a fan of his to begin with, but that "Titan reasoning" argument on the pricing was as spastic as it gets. WTF!?!?
 
Yeah, that's the bit that stands out. The chap from Science Studio, who I think is very down to earth and usually does his research well, questioned Jay's logic. I don't know whether Jay responded or not. I imagine a lot of people had the same facial expression listening to that bit: one eyebrow cocked so high it falls off your head.
 
Think what you want; Jay is not wrong: Titan = Ti now. However, with that I'm exiting the debate :)

Anyway, @Kaapstad, something is quite wrong there; the 2080 should at least score better than that.
 
It's OK if you don't want to respond, I do understand. But I'll just say that Titan = Ti is not necessarily what I took issue with. The issue is...

Well, for starters, he says it's "OK" that the Titan is now a Ti because AIB partners can now make custom cards for the Tita... I mean, Ti. But that's absurd, because the Titan was only ever slightly faster than the Ti, and we could always get the Ti as a custom board. The worst part, though, is that the 2080 Ti is still a cut-down Titan GPU. He's calling it a Titan, but it's cut down, so it's not really a 'Titan', because Titans are not cut down. Except they are cut down... sometimes. This is why his statements make no sense.

Another huge problem is that he thinks it's OK that the 2080 is $50 cheaper than the 1080 Ti. For one, that's rubbish, at least right now: no one can buy an RTX 2080 for less than a 1080 Ti. MSRP means pretty much nothing; Vega 64 had an MSRP of $500, and where were those cards? They didn't exist. Secondly, why is an XX80 card being $50 cheaper than the two-year-old previous flagship anything to gloat about? He thinks it's OK to charge £800 for the 2080 because it's 20-25% faster than the previous XX80 card and because it has RT and Tensor cores. That's absurd to me. It would need to be 20-25% faster than the Titan Xp to be worth £750, in my opinion.

The 2080 is not "still the 2080"; it's the 1080 Ti, but with RT and Tensor cores and a higher price tag, not a lower price as with previous generations. The 2070 is not "still the 2070"; it's the 1080 (+ maybe 10%), but with RT and Tensor cores and a higher price tag (the 2070 FE will cost UK customers about £600-650, while a GTX 1080 can be had right now for £500). This is what I find so strange about Jay's video. Pricing is not in line with what we would expect from a new launch, despite what he says at the end. Pricing is extremely high, and performance may not mirror it. To me, Nvidia are saying, 'Here's 1080 Ti performance for more money, but it has Tensor cores and RT abilities.' That's the 2080. That's not what nearly 2.5 years of development used to deliver. In the past, Nvidia would be saying, 'Here's 1080 Ti performance for less money, and it has Tensor cores and RT abilities.' Then the question becomes: can Nvidia actually do that? Maybe not. Maybe Turing is like Vega: late, big, expensive, and out of place.
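For what it's worth, that 2070-vs-1080 value claim can be sanity-checked with a couple of lines of Python, using only the figures quoted above (2070 FE at the low end of the £600-650 range, GTX 1080 at £500, and a generous +10% performance; these are the post's estimates, not measured results):

```python
# Rough check of the 2070-vs-1080 value claim, using the figures
# quoted in the post: 2070 FE ~£600 (best case), GTX 1080 ~£500,
# and an assumed ~10% performance advantage for the 2070.

def value(relative_perf: float, price_gbp: float) -> float:
    """Relative performance per pound spent."""
    return relative_perf / price_gbp

gtx_1080 = value(1.00, 500)  # baseline: GTX 1080 at street price
rtx_2070 = value(1.10, 600)  # best case for the 2070

change = rtx_2070 / gtx_1080 - 1.0
print(f"Perf-per-pound change: {change:+.1%}")
# prints: Perf-per-pound change: -8.3%
```

Even with the most favourable numbers in the post, performance per pound goes down by about 8%, which is the opposite of what a new generation normally delivers.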
 