Nvidia RTX 2080 Ti and RTX 2080 specifications leaked

Not powerful enough after more than two years since the first Pascal card launched.

But for what? We're no longer limited by GPU horsepower for the majority of titles, so it's not that. Innovation for the sake of innovation? It's looking like Turing has a lot of that already and is by no means a refresh. A reason to upgrade? That's entirely personal. Many people didn't bother upgrading to Pascal and are still on GTX 970s and 980 Tis. An RTX 2070 could offer a nice performance boost over a 970. But as for Pascal owners, if games actually demanded more performance, then what's on offer here would be more welcome because it would be more needed. Then we go back to the first question: for what? If you're on a GTX 1080 and expected huge things after two years in a market with virtually no competition and no real die shrink, I don't know where that expectation comes from.

Quite frankly, Turing seems more than I thought it would be.
 
TBH, I think the RTX 2080 Ti's reported 20+% increase in CUDA core count is a decent leap. Assuming the GDDR6 memory runs at 14Gbps, that gives a 27% boost in memory bandwidth.
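
For anyone wanting to sanity-check that figure, here's a rough back-of-the-envelope sketch. It assumes the rumoured 352-bit bus on the 2080 Ti and compares against the 1080 Ti's 11 Gbps GDDR5X on the same bus width; none of these numbers are confirmed specs.

```python
# Back-of-the-envelope memory bandwidth comparison.
# The 2080 Ti figures are leaked/rumoured, not confirmed specs.

def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s = per-pin data rate * bus width / 8 bits per byte."""
    return data_rate_gbps * bus_width_bits / 8

gtx_1080_ti = bandwidth_gb_s(11, 352)   # ~484 GB/s (11 Gbps GDDR5X)
rtx_2080_ti = bandwidth_gb_s(14, 352)   # ~616 GB/s (14 Gbps GDDR6, if the leak holds)

print(f"{gtx_1080_ti:.0f} GB/s -> {rtx_2080_ti:.0f} GB/s "
      f"(+{(rtx_2080_ti / gtx_1080_ti - 1) * 100:.0f}%)")   # roughly +27%
```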

Nvidia may also offer some GPU clock speed increases or a boost in performance per clock. We don't know enough to judge performance improvements right now.

Beyond that, I think everyone is dismissing Nvidia's RT a little too quickly. Microsoft doesn't make an API for something unless they see a future in it. As an industry, we ultimately want to move towards ray tracing; Nvidia RTX won't be another PhysX.
 
Actually, NVIDIA is kinda shafting us here. They moved the naming scheme down a model, so for example the 2070 is really what should have been a 2060 (the xx70 part was previously the 104 die, now it's the 106), the next generation up from a 1060, but the prices aren't moving down in the same way. Though that's not entirely new with this generation. Additionally, the cards only use 14 Gbps GDDR6 instead of the 16 Gbps that all three memory manufacturers already offer.

And it's not like we don't need more powerful GPUs. Unless you're on 1080p, we are limited. Very. We're well past the 1080p era and 4K is here. Running games at 4K is actually pretty difficult if you're shooting for high framerates. You can forget about high FPS at 4K even if you do find a monitor with a high refresh rate. Some (though very few) AAA games even struggle to hit 60 FPS at 4K. And even if you move down to 1440p, there's still a lot to be desired. There are just so many games that you can't run at high enough framerates to make use of the high refresh rates on modern monitors.
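
To put rough numbers on why resolution hurts so much, here's a quick sketch that just counts pixels per frame (ignoring everything else that scales with resolution):

```python
# Pixels per frame at common gaming resolutions, relative to 1080p
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP ({pixels / base:.2f}x 1080p)")

# 1440p is ~1.78x the pixels of 1080p and 4K is 4x, which is a big part of
# why high-refresh 4K is so much harder to drive.
```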
 
To me it's just a naming scheme. What definitive proof is there that by shifting the RTX 2070 to the 106 die (or whatever it is), Nvidia is inherently pocketing money while reducing performance? That's the impression I get from a lot of people: they feel Nvidia are forever finding new ways to screw consumers over. I feel the same way about the argument that the GTX xx80 (or previously x80) still being a flagship card but with the die size of a midrange chip is inherently anti-consumer, that consumers are being shafted with high prices for a small chip.

To me, that's highly speculative and needlessly damning. Yes, if you view things pessimistically, it's possible Nvidia have pocketed massive savings while consumers suffered. Yet here we are, with almost all the naysayers happily gaming on their Pascal GPUs. And no one can actually prove that Nvidia have pocketed all those supposed savings, because we don't know the costs of researching, developing, manufacturing, advertising and shipping Pascal, or anything else Nvidia do. All we have is a few numbers that we feel should match up to our expectations, and a whole lot of bitterness.

Is 16 Gbps memory available en masse at an affordable price, though?

I disagree that 4K is here. For consoles it's emerging, and for PC gaming it's ticking along at the same pace it has for years, but it's still relatively elusive. Many people game at 4K, but only at 60 FPS, and that's a very manageable number to hit consistently with a 1080 Ti. If you insist on every setting at the highest it will go in every game, which incidentally I don't believe many developers expect you to do, then yes, a 1080 Ti won't cut it. But in my opinion we have enough horsepower for current games at everything except 4K/120Hz, which is as elusive now as 4K/60Hz was five years ago. I would even argue that 4K for consoles is growing at a faster rate than for PC gaming.

1440p/144Hz is very possible with a 1080 Ti, or even a Vega 64 or GTX 1080. Simply dial back some of the more superfluous settings and you're there. I recognise that not all gamers are willing to do that, but I won't mistake that for 'not being powerful enough'. Because what exactly is 'not powerful enough'? Millions of gamers are quite happy to play some games at 960p/30 FPS on console, and they experience the stories with huge smiles on their faces. I've regularly and contentedly reduced settings to hit the sweet spot of 100 FPS or more with my GTX 1080 at 1440p. The fluidity of 100 FPS is worth more than higher-resolution dust particles or the highest anti-aliasing in a fast-paced game. I understand that PC gamers are a particular bunch who like things a certain way, and I respect that, but at some point it starts to look like a spoilt teenager saying he won't play football for his team unless all the boots match.
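
For context, here's what those framerate targets mean in frame-time budget; a quick sketch that isn't tied to any particular GPU:

```python
# Frame-time budget at common framerate targets
for fps in (30, 60, 100, 144):
    print(f"{fps:>3} FPS -> {1000 / fps:.1f} ms per frame")

# 30 FPS -> 33.3 ms, 60 FPS -> 16.7 ms, 100 FPS -> 10.0 ms, 144 FPS -> 6.9 ms;
# every step up leaves noticeably less time per frame, which is why dropping a
# few settings to hold 100+ FPS is often the better trade in fast-paced games.
```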
 
Turing is what it should be after over two years, in regard to innovation.

And yes, we are still GPU bound. As we should be. We're still hardly using more than a few CPU cores in the majority of games.

It's entirely reasonable to expect huge gains after two years of nothing. We've already had generational cycles where the newer card delivers one tier higher horsepower while sitting one step lower in the product range (i.e. the 1060 matches the 980, the 1070 matches the 980 Ti, etc.).

Yet it seems this is exactly what we are getting again after two years, with the only big change being that NVLink and Tensor cores have been added onto the die thanks to the die shrink.
 
I don't want to be pedantic, but I wasn't talking about being GPU bound in the sense that the CPU is barely doing anything. What I meant is that the most popular games are very playable at almost any level with current GPUs. Anything from budget to enthusiast solutions is, for the most part, perfectly adequate and can offer an excellent gaming experience. Compare that to when the original Titan came out, when PC Gamer had Digital Storm build a 4-way Titan SLI machine to run 4K/60Hz. Or when people had to run GTX 680 SLI just to play Crysis 3 at 1080p with anything higher than medium settings.

What did you expect from Turing, exactly? I don't mean that argumentatively; I genuinely want to know, because I don't know. Over two years is a long time between architectures, but what if Turing was taped out and couldn't realistically be improved any further once its final design was settled, which could have been six months ago? If it was finished six months ago and Nvidia were simply sitting on it, that's frustrating, but it would explain why after so long we're 'only' seeing Turing. Which, incidentally, we know very little about. For all we know, Turing is going to be acceptable in the grand scheme of things. It might be one of those 'all things considered, Turing ain't bad' type of situations.
 