Alleged RTX 2080 3DMARK Time Spy result leaks

Makes sense given the CUDA cores. I'm more inclined to believe this now that we see some initial specs. Although, it still seems like the GPU clock speeds are an undefined grey area.
 
The clocks are believable for a 2080; large 12nm chips can still reach high clock speeds.

Here is one of mine, with an air cooled card.

https://www.3dmark.com/3dm/23981974?

Graphics Score 14 598

Of course, I didn't say they were not believable. It's just that everywhere the cards are on sale, I see "TBC" in place of the clock specifications.

That tells me they are either doing this to prevent predictions such as "it's only X% faster than a 1080 Ti", or they're finalising the cards with some tweaks and touch-ups to ensure a good product launch.

Edit: just saw the Gainward Golden Sample listing a clock speed of 1,650 MHz. Seems pretty decent for a first derivative card.
 
Before people mock and write off the card, remember that while FPS doesn't increase, you would be gaining DLSS and real-time RT.

Yes, it's overpriced, but new tech is new tech.
 

DLSS is what I'm most interested in. RT I couldn't care less about until two generations from now.
 
Mark, is this a typo?

"Today, the lowest priced GTX 1080 Ti on Overclockers UK is £689.99, a mere £26 lower than the cheapest pre-order price for a custom RTX 2080 Ti."


Personally I think this performance is pretty good, but the prices are too prohibitive. To me, while Turing is exciting architecturally, a lot of what really gets my nether regions moist is value for money as well as advanced features. Zen is a perfect example of that. Turing is Vega without any competition and even more advanced (or, expensive).
 
As expected really. I still think the 2080Ti will be about as fast as the Titan V or slower.

And also like f34r I am really not in the mood for this until RT can be done efficiently and quickly. And that will not happen for at least 2 more gens, IMO, so they can booger orf.

I jumped on the 4k train too early and then threw thousands of pounds at it. Never again.
 
Ok, so the industry has been pushing people to buy 4K or high refresh rate monitors for a while, and a lot of marketing material talks about 4K performance. Now it's looking like 4K with ray tracing is pretty much unreachable, so do people now have to downscale resolution just to use RT tech?! The 20XX range is looking more and more like an expensive 1080p GPU for RT, with the 1080 Ti still looking like the best affordable option for high-resolution gaming. Am I missing something?
 

No, you use DLSS to reach 4K at better frame rates.
 
Not fussed about DLSS; at 1440p and above you won't see much of a difference. Gamers wanted more FPS at higher resolutions, not these gimmicks.
 
What matters most here is that the 2080 series now offers ray tracing technology. Maybe that's worth investing in.
 

Not yet it isn't. RT is Nvidia's new carrot, just like 4K was. Just one issue: the hardware to max it out does not exist yet.

It wasn't until Pascal (Titan Xp/1080 Ti) that you could max out 4K at playable FPS, and it will be two to three generations until you can do the same with RT. Only this time they want £1,100 per generation to chase the dragon.

I don't think RT is a gimmick; I just think it will be a while until it's worth having, providing of course they can get enough devs on board, coupled with the three-year or so dev cycle.
 
But that's a 2080 compared to a 1080 Ti - compare apples to apples please.

It is. Price vs price. It's closer to a 1080 Ti's launch price than a 1080's. In fact, it's more expensive than a 1080 Ti was, yet barely faster. It's a worse buy based on that alone, while also getting less memory than the 1080 Ti.
 
The 2080 is also physically a larger chip than the 1080 Ti, at 529 mm² to 471 mm². The RTX 2080 is NOT a midrange chip, the way I see it. It is a large die, it has many of the advanced features of the flagship GPUs, and it costs more than, or around the same as, the previous flagship. It also has a high TDP.

The way I see it, Nvidia don't intend on reducing prices. The 'only' thing they've done is swap the 1080 Ti with the RTX 2080 and add Tensor cores and RT abilities. The clock speeds are similar, the prices are the same, the die size is similar, and the TDP isn't far off. In other words, Pascal-level performance per dollar is going to stick around. It's not like previous generations, where you got more performance for less money (the 970 was as fast as a 780 Ti but cost less, the 770 was the same chip as the 680 but cheaper, etc).

Now, what Nvidia have done is keep the same performance per dollar, thus shifting everything up a notch, and added features you won't benefit from for a few years. The 2080 quite literally replaces the 1080 Ti. Performance per dollar doesn't change. It's almost like Turing doesn't exist. It's just adding another layer of sugary fat to the cake that was already there. It's not a new cake that's better than the previous one; they've just added another layer of flavour.
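The performance-per-dollar argument above can be put in rough numbers. A minimal sketch: the 1080 Ti figures (14,598 Time Spy graphics score, £689.99) come from this thread, while the RTX 2080 score and price here are placeholder assumptions for illustration only, not leaked or measured values:

```python
# Sketch of the performance-per-pound comparison made in the thread.
# 1080 Ti numbers are from this thread; the 2080 numbers are ASSUMED placeholders.

def perf_per_pound(graphics_score: int, price_gbp: float) -> float:
    """Time Spy graphics score obtained per pound spent."""
    return graphics_score / price_gbp

# GTX 1080 Ti: 14,598 graphics score (air-cooled card above), £689.99 cheapest on OcUK.
gtx_1080_ti = perf_per_pound(14_598, 689.99)

# RTX 2080: hypothetical ~10,000 score at a ~£715 pre-order price (placeholders).
rtx_2080 = perf_per_pound(10_000, 715.00)

print(f"1080 Ti: {gtx_1080_ti:.1f} points/£")
print(f"2080:    {rtx_2080:.1f} points/£")
```

If the assumed 2080 numbers were anywhere near right, the older card would deliver more score per pound, which is the whole "performance per dollar doesn't change (or gets worse)" point.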
 

I was about to write my opinion on this matter, but it's pretty much summarised here.

Also, when you make a comparison, one important aspect is the price. There was a time here in Spain when a Vega 64 was €200-250 MORE expensive than any 1080... so, for roughly the same performance, there is no way anyone would think of buying a Vega 64.

The same goes for the 2080: its counterpart SEEMS to be the 1080 Ti, on both price and performance.
 
And I think that is mainly due to a severe lack of competition. Not just a slight lack, but a severe lack. Vega, for all intents and purposes, could never have happened and prices for Turing and Pascal would be just the same, I reckon. Gamers wouldn't be any the worse for wear, for the most part. If Vega actually competed where it mattered (1080 and 1080 Ti levels), Turing as it currently stands would be a very hard sell (as if it isn't already).
 