Quick News

Yeah, I think it's a lot easier to market processors or electronics with numbers. I'm surprised AMD stuck with their codename, Vega. I think it works really well, especially with the 56 and 64, but Nvidia are likely to continue with their boring nomenclature as it's simpler.

Yup, and to think the only reason it is labelled as GTX was because of a competition where a consumer got to choose the naming.
 
Because 2080 looks better than 1180. If they go with 2080 then they can start moving towards 3080, 4080, etc., which is a cool shift. The reality is, I'm just calling it that as a placeholder. I have no idea what it's actually going to be called. I didn't think they'd call the 1080 a 1080. I thought it sounded silly, but that's what they went with, so what do I know? :p

I want the Buzz Lightyear Edition :p
 
My money was on a name like 'GTX V80', but I guess we're never going to see a GeForce Volta now then?
 
I don't see Nvidia putting anything new out in terms of graphics cards. Why should they when their current offerings are making them so much money? Had Vega been what we all hoped for, then we would have seen more from Nvidia, but right now there is simply no need, as they can continue to sell mid-range GPUs as their top line.
 
The exact same reason Intel kept pushing on and shrinking after Bulldozer released: money.

All of the fanboyz will have bought a card on Pascal by now. The only people returning are those after an upgrade (but not particularly a fanboy or girl) or people with a failed card. That is slim pickings compared to the mass of sales at a launch, and Nvidia know this.

If things turn stale they will still sell cards but nowhere near as much. Nobody with a 10 series will be buying another 10 series, if that makes sense.
 
I know what you mean. However, signs are pointing to a new release soon.

What I think will happen is that the new XX80 and XX70 GPUs will only offer marginal increases in performance in DX11 benchmarks, but at a lower TDP in chips that are easier and cheaper for Nvidia to produce. The XX80 could be 5% faster than a Titan Xp and people will still buy it, because it'll be a much smaller GPU (and thus easier to cool and cheaper to buy), and it also might have higher DX12 performance (say, a 10-15% gain over Titan Xp versus 5% in DX11), which will add sugar to the cake and entice people to upgrade. They could sell the XX70 for £500 and the XX80 for £650 and people would buy them, because they'd be faster than a 1080 Ti, draw less power, be slightly cheaper, and be new and 'Nvidia-shiny' (which is a whole other kind of shiny).

That's what Intel were doing, as Alien said, ever since Bulldozer came along and failed. GPUs and CPUs are not the same, but Vega is, right now, the Bulldozer of Radeon. It has room to mature, but so did Bulldozer, and by and large it didn't.
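To put rough numbers on that speculation, here's a quick Python sketch; every figure in it (the performance deltas, the £500/£650 prices, and the assumed ~£1,160 Titan Xp price) is hypothetical, taken from the paragraph above rather than from any real benchmark:

```python
# Hypothetical price/performance comparison from the speculation above.
# Titan Xp is the baseline (100 in both APIs); nothing here is a measured result.
cards = {
    # name: (DX11 perf, DX12 perf, price in GBP)
    "Titan Xp": (100.0, 100.0, 1160),  # assumed launch price, for reference only
    "XX80":     (105.0, 112.5, 650),   # +5% DX11, +10-15% DX12 (midpoint used)
}

for name, (dx11, dx12, price) in cards.items():
    print(f"{name}: DX11 per GBP = {dx11 / price:.3f}, DX12 per GBP = {dx12 / price:.3f}")
```

Even a 5% DX11 gain at that price would put the hypothetical XX80 at nearly double the performance per pound of a Titan Xp, which is the whole sales pitch.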
 
Depending on the software, the performance gain in DX12 could be huge.

Unfortunately, most DX12 games don't, err, really support DX12.

Here are some Timespy Extreme DX12 scores from another forum. Check out the graphics score of the Volta Titan:

https://forums.overclockers.co.uk/posts/31207163/
 
Yeah, those scores look good. The Titan X Maxwell is slightly slower than a GTX 1080, right? That means the jump from Titan XM to Titan Xp is from a graphics score of around 3200 (just an estimate; I don't know the exact score) to 5352, an increase of roughly 2150 points. The jump from Titan Xp to Titan V is 2200 points. So that's a comparable jump, generation on generation.
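The arithmetic behind that, as a quick Python sketch (the Titan XM score is the same rough estimate as above, and the Titan V score is derived from the 2200-point jump quoted, not measured here):

```python
# Generation-on-generation jumps in Timespy Extreme graphics score.
# Titan X (Maxwell) is an estimate; the others follow the thread above.
titan_xm = 3200             # estimated graphics score (assumption)
titan_xp = 5352             # score quoted from the linked thread
titan_v = titan_xp + 2200   # Volta Titan, per the ~2200-point jump noted

print(f"Titan XM -> Titan Xp: +{titan_xp - titan_xm} points")  # +2152, roughly 2150
print(f"Titan Xp -> Titan V:  +{titan_v - titan_xp} points")   # +2200
```

In absolute points, the Pascal-to-Volta jump comes out about the same size as the Maxwell-to-Pascal one, which is why the scores read as a genuine improvement.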
 
Must get my 1070 out and give it a bash in Timespy Extreme and see if it can beat my Kingpin 980 Ti cards.

It should.

When Wraith got his 1070 ages and ages ago we had a benchmark bash. In DX12 I could not catch the 1070, even at 1458MHz, which really was the absolute limit of the card. It wasn't temps, they were fine, but it had no more to give clock-speed-wise.

In Firestrike, though, the XM was the better card. That was also using Timespy when it first came out, and as we know, DX12 can hinder performance somewhat. However, I did read that Nvidia had made DX12 better (better, but not perfect) on Pascal. I think it was one of the tweaks they did before shrinking it.
 
My Kingpin 980 Ti (82% ASIC) cards are beasts even on air and can reach 1575/2100 (stock volts and BIOS). They are fractionally faster than my watercooled Maxwell Titans at low resolutions and fractionally slower at 2160p.
 
Could you explain ASIC quality to me? I see lots of conflicting reports, and seeing as you have more cards than 99% of people on earth do, I'd like your opinion. :)
 
Thing is, I wouldn't really class the KP as a 980 Ti. Not by any means, lol.

ASIC quality is supposed to give you better overclocks, but it always looked like BS any time I read threads on OCUK.
 
It is not really relevant anymore, as you cannot read the ASIC value on modern cards anyway.

I did a bench thread for it a long time ago, and generally the higher the ASIC, the higher it was possible to OC the GPU and the less voltage it needed to do it.

Another thing I have noticed is that a high ASIC does not mean you are guaranteed the best overclock, as sometimes these chips don't like extra voltage at all. A good example of this is the Kingpin 980 Ti cards I mentioned earlier: they are beasts at overclocking on stock volts but don't like any extra at all.

Sometimes you can get the same performance with a low-ASIC GPU and extra voltage as you can with a high-ASIC card and less voltage, where the latter will not take any extra voltage.
 