  #1  
Old 17-08-18, 06:10 PM
WYP is online now
News Guru
 
Join Date: Dec 2010
Location: Northern Ireland
Posts: 14,589
Nvidia RTX 2080 Ti and RTX 2080 specifications leaked

How powerful is Turing?



Read more about Nvidia's RTX 2080 Ti and RTX 2080 leaked specifications.

__________________
_______________________________
Twitter - @WYP_PC
  #2  
Old 17-08-18, 06:18 PM
NeverBackDown is offline
AMD Enthusiast
 
Join Date: Dec 2012
Location: Middle-Earth
Posts: 15,475
Not powerful enough after over two years since the first Pascal card launched.
__________________
The Cost of Freedom -
"A price payed gladly, in the hopes that the free, live better."
  #3  
Old 17-08-18, 06:40 PM
WYP is online now
News Guru
 
Join Date: Dec 2010
Location: Northern Ireland
Posts: 14,589
Quote:
Originally Posted by NeverBackDown
Not powerful enough after over two years since the first Pascal card launched.
That depends on how well these RT cores and Tensor cores pan out. We will learn more about that on Monday.
__________________
_______________________________
Twitter - @WYP_PC
  #4  
Old 17-08-18, 06:48 PM
AngryGoldfish is offline
Old N Gold
 
Join Date: Jan 2015
Location: Ireland
Posts: 2,654
Quote:
Originally Posted by NeverBackDown
Not powerful enough after over two years since the first Pascal card launched.
But for what? We're no longer limited by GPU horsepower in the majority of titles, so it's not that. Innovation for the sake of innovation? It's looking like Turing has a lot of that already and is by no means a refresh. A reason to upgrade? That's entirely personal. Many people didn't bother upgrading to Pascal and are still on GTX 970s and 980 Tis; an RTX 2070 could offer a nice performance boost over a 970. As for Pascal owners, if games actually demanded more performance, then what is on offer here would be more welcome, as it would be more needed. Then we go back to the first question: for what? If you're on a GTX 1080 and expected huge things after two years, in a market that has virtually no competition and no real die shrink, I don't know where that expectation comes from.

Quite frankly, Turing seems more than I thought it would be.
__________________
ASUS X370 Crosshair VI Hero ⁞⁞ Ryzen 1600X 4GHz ⁞⁞ Thermalright Le Grand Macho RT ⁞⁞ Aorus GTX 1080 11Gbps ⁞⁞ G.Skill TridentZ 3200MHz
Jonsbo W2 ⁞⁞ Corsair AX760 ⁞⁞ Pexon PC ⁞⁞ Samsung 960 EVO 250GB & 850 EVO 500GB ⁞⁞ Western Digital 1TB Blue & 3TB Green
BenQ XL2730Z ⁞⁞ Mionix Naos 7000 ⁞⁞ Corsair K70 Cherry MX Brown ⁞⁞ Audio-GD NFB-15 ⁞⁞ EVE SC205 ⁞⁞ AKG K7XX
  #5  
Old 17-08-18, 08:30 PM
WYP is online now
News Guru
 
Join Date: Dec 2010
Location: Northern Ireland
Posts: 14,589
TBH, I think the RTX 2080 Ti's reported 20+% increase in CUDA core count is a decent leap. Assuming the GDDR6 memory runs at 14Gbps, that gives a 27% boost in memory bandwidth.
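
For anyone who wants to check the sums, here's a quick back-of-the-envelope sketch in Python. It assumes the leaked figures (4352 CUDA cores and a 352-bit bus for the 2080 Ti) against the 1080 Ti's known 3584 cores and 11Gbps GDDR5X; none of the 2080 Ti numbers are confirmed yet.

Code:
# Rough sanity check of the percentages above. The 2080 Ti figures
# come from the leak and are unconfirmed; the 1080 Ti figures are known.

def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    # Bandwidth in GB/s = bus width in bytes * per-pin data rate
    return bus_width_bits / 8 * data_rate_gbps

cores_1080ti, cores_2080ti = 3584, 4352
bw_1080ti = bandwidth_gb_s(352, 11)  # ~484 GB/s (GDDR5X)
bw_2080ti = bandwidth_gb_s(352, 14)  # ~616 GB/s (GDDR6, if 14Gbps holds)

print(f"CUDA cores: +{(cores_2080ti / cores_1080ti - 1) * 100:.0f}%")  # +21%
print(f"Bandwidth:  +{(bw_2080ti / bw_1080ti - 1) * 100:.0f}%")        # +27%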

Nvidia may also offer some GPU clock speed increases or a boost in performance per clock. We don't know enough to judge performance improvements right now.

Beyond that, I think that everyone is dismissing Nvidia's RTX a little too quickly. Microsoft doesn't make an API for something unless they see a future in it. As an industry, we ultimately want to move towards ray tracing; Nvidia RTX won't be another PhysX.
__________________
_______________________________
Twitter - @WYP_PC
  #6  
Old 17-08-18, 08:50 PM
TheF34RChannel is offline
OC3D Elite
 
Join Date: Mar 2011
Location: The Netherlands
Posts: 2,078
I sincerely doubt that they've even decided upon the Ti specs yet; I call BS. :P
__________________
Asus Z370 MXH | 8700K 4.7GHz | Corsair H105 | G.Skill Trident Z 3333MHz CL16 16GB | Asus GTX 1080 Strix | EVGA SuperNOVA 850 P2 | Crucial MX200 500 GB | Acer XB270HU 1440p | Win10 Pro x64
We call ourselves legion, for many we be. So hand me your kingdom, I'll take it from here. And dare no one break from, my circles of fear

  #7  
Old 17-08-18, 08:54 PM
Chopper3 is offline
Member
 
Join Date: May 2013
Location: Christchurch, Dorset
Posts: 184
Does this imply that the Quadro RTX 6000 might actually be quite a good, if expensive, gaming card?
  #8  
Old 17-08-18, 08:59 PM
TheF34RChannel is offline
OC3D Elite
 
Join Date: Mar 2011
Location: The Netherlands
Posts: 2,078
Quote:
Originally Posted by Chopper3
Does this imply that the Quadro RTX 6000 might actually be quite a good, if expensive, gaming card?
Don't do it, mate, don't... the fan will be squealing for sure
  #9  
Old 17-08-18, 09:05 PM
svigl is offline
Newbie
 
Join Date: Aug 2018
Posts: 1
Quote:
Originally Posted by AngryGoldfish
But for what? We're no longer limited by GPU horsepower in the majority of titles, so it's not that. Innovation for the sake of innovation? It's looking like Turing has a lot of that already and is by no means a refresh. A reason to upgrade? That's entirely personal. Many people didn't bother upgrading to Pascal and are still on GTX 970s and 980 Tis; an RTX 2070 could offer a nice performance boost over a 970. As for Pascal owners, if games actually demanded more performance, then what is on offer here would be more welcome, as it would be more needed. Then we go back to the first question: for what? If you're on a GTX 1080 and expected huge things after two years, in a market that has virtually no competition and no real die shrink, I don't know where that expectation comes from.

Quite frankly, Turing seems more than I thought it would be.
Actually, NVIDIA is kinda shafting us here. They've moved the naming scheme down one model, so, for example, the 2070 is really what should have been a 2060 (the x70 tier previously got the 104 die; now it gets the 106), the next generation on from a 1060, but the prices aren't moving down to match. Though that's not really new with this generation. Additionally, the cards only have 14Gbps GDDR6 instead of the 16Gbps that all three memory makers already offer.

And it's not like we don't need powerful GPUs. Unless you're on 1080p, we are limited. Very. We're way past the 1080p era and 4K is here. Running games at 4K is actually pretty difficult if you're shooting for high framerates; you can forget about high FPS at 4K even if you do find a monitor with a high refresh rate. Some (though very few) AAA games even struggle to hit 60FPS at 4K. And even if you move down to 1440p, there's still a lot to be desired. There are just so many games that you can't run at framerates high enough to utilize the high refresh rates on monitors.
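
Just to put rough numbers on those resolutions, here's a quick illustrative sketch (nothing from the leak, just pixel arithmetic):

Code:
# Pixels per frame at common resolutions, relative to 1080p.
# Shading cost scales roughly with pixel count, which is why
# 4K is so much harder to drive at high framerates.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")
# 4K pushes 4.00x the pixels of 1080p and 2.25x those of 1440p.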
  #10  
Old 17-08-18, 10:17 PM
AngryGoldfish is offline
Old N Gold
 
Join Date: Jan 2015
Location: Ireland
Posts: 2,654
Quote:
Originally Posted by svigl
Actually, NVIDIA is kinda shafting us here. They've moved the naming scheme down one model, so, for example, the 2070 is really what should have been a 2060 (the x70 tier previously got the 104 die; now it gets the 106), the next generation on from a 1060, but the prices aren't moving down to match. Though that's not really new with this generation. Additionally, the cards only have 14Gbps GDDR6 instead of the 16Gbps that all three memory makers already offer.

And it's not like we don't need powerful GPUs. Unless you're on 1080p, we are limited. Very. We're way past the 1080p era and 4K is here. Running games at 4K is actually pretty difficult if you're shooting for high framerates; you can forget about high FPS at 4K even if you do find a monitor with a high refresh rate. Some (though very few) AAA games even struggle to hit 60FPS at 4K. And even if you move down to 1440p, there's still a lot to be desired. There are just so many games that you can't run at framerates high enough to utilize the high refresh rates on monitors.
To me it's just a naming scheme. What definitive proof is there that by shifting the RTX 2070 to the 106 die (or whatever it is), Nvidia is inherently pocketing money while reducing performance? That's the impression I get from a lot of people: that they feel Nvidia are forever finding new ways to screw consumers over. I feel the same way about the argument that the GTX XX80 (or previously X80) still being a flagship card but with the die size of a midrange card is inherently anti-consumer, that consumers are being shafted with high prices for a small chip. To me, that's highly speculative and damning. Yes, if you view things pessimistically, it's theoretically possible Nvidia have pocketed massive savings while consumers suffered. Yet here we are, with almost all the naysayers happily gaming on their Pascal GPUs. And no one can actually prove that Nvidia have pocketed all those extra savings, because we don't know the costs of researching, developing, manufacturing, advertising, and shipping Pascal, or anything else Nvidia do. All we have is a few numbers that we feel should match up to our expectations, and a whole lot of bitterness.

Is 16Gbps memory available en masse, though, at an affordable price?

I disagree that 4K is here. For consoles it's emerging, and for PC gaming it's ticking along at the same pace it has done for years, but it's still relatively elusive. Many people game at 4K, but only at 60FPS, and that's a very manageable number to hit consistently with a 1080 Ti. If you insist on every setting being as high as it'll go in every game, which incidentally I don't believe many developers care about you doing, then yes, a 1080 Ti won't cut it. But in my opinion we have enough horsepower for current games at everything except 4K/120Hz, which is as elusive as 4K/60Hz was five years ago. I would argue that 4K for consoles is growing at a faster rate than for PC gaming.

1440p/144Hz is very possible with a 1080 Ti, or even a Vega 64 or GTX 1080: simply adjust some of the more superfluous settings and you're there. I recognise that not all gamers are willing to do that, but I won't mistake that for 'not being powerful enough'. Because what exactly is 'not powerful enough'? Millions of gamers are quite happy to play at 960p/30FPS in some games on console, and they experience the stories with huge smiles on their faces. I've regularly and contentedly reduced settings in games to hit the sweet spot of 100FPS or more with my GTX 1080 at 1440p. The fluidity of 100FPS is worth more than higher-resolution dust particles or the highest anti-aliasing in a fast-paced game. I understand that PC gamers are a particular bunch who like things a certain way, and I respect that, but it gets to a point where it starts to look like a spoilt teenager saying he won't play football for his team unless all the boots match.
__________________
ASUS X370 Crosshair VI Hero ⁞⁞ Ryzen 1600X 4GHz ⁞⁞ Thermalright Le Grand Macho RT ⁞⁞ Aorus GTX 1080 11Gbps ⁞⁞ G.Skill TridentZ 3200MHz
Jonsbo W2 ⁞⁞ Corsair AX760 ⁞⁞ Pexon PC ⁞⁞ Samsung 960 EVO 250GB & 850 EVO 500GB ⁞⁞ Western Digital 1TB Blue & 3TB Green
BenQ XL2730Z ⁞⁞ Mionix Naos 7000 ⁞⁞ Corsair K70 Cherry MX Brown ⁞⁞ Audio-GD NFB-15 ⁞⁞ EVE SC205 ⁞⁞ AKG K7XX