AMD Teases R9 Fury X2 GPU

WYP

News Guru
AMD has started teasing the R9 Fury X2, a dual-Fiji GPU that will be ready in time for the launch of VR.

Read more on AMD teasing their upcoming Dual Fiji GPU.
 
I really, really hate to admit it, but Crossfire really isn't bad now. I was fully expecting to be here whining and moaning about it, but I have to give credit to AMD where it's due: it really is every bit as good as SLI now.

Really looking forward to this card :)
 
Do you mean driver support and optimisations? I assume so, because you were slightly disappointed with your Fury X Crossfire setup due to the VRAM, weren't you? Or am I thinking of somebody else / remembering it wrong?
 
VRAM is most certainly an issue with the Fury range. For a few games, 4GB is simply not enough.

Crossfire support, however, is good.

I really don't know who to blame with regards to VRAM. It's hard to blame AMD, considering there are plenty of games out there that play very nicely with 4GB of VRAM even at 4K, yet others (e.g. BLOPS III) look like crap and guzzle VRAM.

So is it AMD not putting enough on there to manage poorly optimised games, or do we blame devs who make fat, bloated textures in badly optimised old engines that use too much?
 
I think I understand the conundrum.

But even when more than 4GB is used, the Fury X still competes at higher resolutions against the 980 Ti. Is it only one or two games where the VRAM is causing the issue?

I remember reading an initial review of the Fury X that was rather negative, saying the 4GB of VRAM was holding the card back. Yet at higher resolutions the Fury X was competing against the 980 Ti and closing the gap. I noticed other people saying this as well, and I always thought that the HBM shone at higher resolutions despite the limitations. How can the 4GB be its limitation when, even while technically using less VRAM than a 980 Ti, the Fury X still competes and even closes the gap? I don't understand that.
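
Part of the answer might just be arithmetic: the buffers that actually grow with screen resolution are small next to the texture pool. A rough sketch in Python; the count of six buffers is an invented assumption for illustration, not a figure from any real game:

```python
# Rough render-target maths: how much VRAM do the full-screen buffers
# themselves take at each resolution?  The count of 6 buffers (colour,
# depth, G-buffer layers, post-processing, etc.) is an assumption for
# illustration, not a figure from any particular game.

def render_targets_mib(width, height, buffers=6, bytes_per_pixel=4):
    return width * height * bytes_per_pixel * buffers / 2**20

for name, (w, h) in [("1080p", (1920, 1080)),
                     ("1440p", (2560, 1440)),
                     ("4K",    (3840, 2160))]:
    print(f"{name}: ~{render_targets_mib(w, h):.0f} MiB of render targets")

# Roughly 47, 84 and 190 MiB.  Even at 4K that's a few hundred MiB at
# most, so raising the resolution mostly stresses bandwidth and shader
# throughput (where HBM and the Fury X do well) rather than eating into
# the 4GB -- it's the texture pool that does that.
```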
 
When this card was first hinted at, I told myself I'd get it or the inevitable Nvidia equivalent. All this vague hinting is doing is putting me off.
 
The VRAM isn't an issue. It's only terribly unoptimized games like BO3 and XCOM 2 where it is. There's no reason they need 10GB of memory when BF4 or The Witcher 3 have no problems at all and are much prettier and more complex.
 
I think the general consensus when the Fury X was reviewed was that 4GB seemed to be enough, that there weren't any problematic apps or games that used more than it had, and that thanks to the speed and bandwidth of the memory it could probably just cope with games that use a bit more.

I have found two games that push the card(s) over the limit at 4K. One is BLOPS III (but they 'fixed' it by simply removing settings) and the other is Rise of the Tomb Raider. It basically displays the same sort of problems as BLOPS III does (refusing to load, crashing once loaded, massive pauses and so on).

So yes, in a couple of instances the 4GB of HBM clearly isn't enough to cope with the bloated textures being given to it (rough numbers at the end of this post).

Now normally I would say debate closed, point proven, but BLOPS III looks like absolute crap even maxed out with its more-than-4GB of textures. It's clearly not the fault of the Fury X; it's more that Treyarch used an old engine and filled it with bloated textures. A more modern engine would probably be able to deliver better graphics using less VRAM, for sure.

As for Rise of the Tomb Raider? Right now I am assuming it is a VRAM issue. Once maxed out, the game exhibits all of the same problems BLOPS III did before Treyarch derped the menus, and AMD say the following:

[85559] Rise of the Tomb Raider™ – Game may intermittently crash or hang when launched with very high settings and AA is set to SMAA at 4K resolution

which admits there is a problem yet doesn't say why.
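
For what it's worth, here's why blowing past the 4GB shows up as massive pauses and crashes rather than a gentle slowdown: whatever doesn't fit has to come over PCIe, which is an order of magnitude slower than HBM. A rough sketch in Python; the 512MiB overflow figure is invented purely for illustration, and real engines stream more cleverly than this, but the scale of the gap is the point:

```python
# Back-of-the-envelope cost of touching data that spilled out of VRAM.
# The 512 MiB overflow figure is invented purely for illustration.

HBM_GBPS = 512.0          # Fury X HBM bandwidth, ~512 GB/s
PCIE3_X16_GBPS = 15.75    # PCIe 3.0 x16, roughly 16 GB/s
FRAME_MS_60HZ = 1000 / 60

overflow_gb = 0.5         # 512 MiB that no longer fits in the 4GB

for name, gbps in [("HBM", HBM_GBPS), ("PCIe 3.0 x16", PCIE3_X16_GBPS)]:
    ms = overflow_gb / gbps * 1000
    print(f"Moving 512 MiB over {name}: ~{ms:.1f} ms "
          f"({ms / FRAME_MS_60HZ:.1f} frames at 60 Hz)")

# ~1 ms over HBM versus ~32 ms (about two whole 60 Hz frames) over
# PCIe -- which is why running out of VRAM looks like hitching and
# stalls rather than a smooth drop in framerate.
```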
 
Bloated textures!? High VRAM usage?... Graphics texture maps are either 512x512 or 1024x1024; 4K textures are simply scaled to fit the chosen resolution, so it's no biggie on hardware. As for high VRAM usage, that's down to bad optimising of game code. 4GB is plenty.
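
For reference, the raw arithmetic on what individual texture maps cost, assuming uncompressed RGBA8 with a full mip chain; block compression (BC1/BC7 and friends) cuts these numbers by roughly 4-8x, so treat them as upper bounds rather than figures from any real game:

```python
# Uncompressed RGBA8 texture footprint; a full mip chain adds roughly
# a third on top of the base level.  Block compression (BC1/BC7 etc.)
# reduces these figures by about 4-8x, so they are upper bounds only.

def texture_mib(size, bytes_per_pixel=4, with_mips=True):
    base = size * size * bytes_per_pixel
    return base * (4 / 3 if with_mips else 1) / 2**20

for size in (512, 1024, 2048, 4096):
    print(f"{size}x{size}: ~{texture_mib(size):.1f} MiB")

# ~1.3, ~5.3, ~21.3 and ~85.3 MiB respectively.  The texture budget
# scales with texture resolution and how many maps the artists ship,
# not with the screen resolution you render at -- which is why two
# games at the same settings can have wildly different VRAM appetites.
```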
 
Yeah, I agree :) Hopefully we will see fewer badly optimised games.
 
Perhaps marketing reasons for delaying the launch of Gemini seemed favorable to AMD.
Hopefully they'll drop the price of the Fury X when Gemini launches.
Anyone have any clue about Gemini's price tag?
$1,500 US?
 
I'd go Crossfire for multi-monitor over SLI - AMD just seem to get their drivers right there, and it's so much more flexible with mixed resolutions/PLP or different freaking sync polarity...

edit - can't comment on 4K as I don't have it.
 
I honestly do not know why they would be releasing this now, considering Polaris isn't really that far away and will probably give you the same performance for far cheaper while using less power.

Granted, it will be good for ITX builds, but why not just come out with a dual Polaris card instead?

As for textures etc., I think it was Joker Productions that did a video regarding Rise of the Tomb Raider showing that you could turn the textures down to the minimum and it would drop the VRAM usage by around 50%, iirc, and although it did look different, it wasn't a massive difference and it didn't seem to affect performance.

Other games though, such as Black Ops 3, will iirc still use all the VRAM available at 4K even with the settings turned down to the lowest, just for the sake of it, and I have noticed that in other games too, which to me just seems stupid.
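
If anyone wants to watch that happen rather than guess, most monitoring tools (GPU-Z, MSI Afterburner and the like) expose a VRAM counter, and on Linux something like the sketch below works, assuming the amdgpu driver's sysfs counters are present (the card0 path is an assumption and may differ per system). Bear in mind it shows what is allocated, not what the game strictly needs, which is exactly why some titles appear to 'use' everything you give them:

```python
# Minimal VRAM monitor for AMD cards on Linux.  Assumes the amdgpu
# driver exposes mem_info_vram_total / mem_info_vram_used under
# /sys/class/drm/card0/device -- adjust card0 for your system.
import time

CARD = "/sys/class/drm/card0/device"

def read_mib(name):
    with open(f"{CARD}/{name}") as f:
        return int(f.read()) / 2**20   # the files report bytes

try:
    total = read_mib("mem_info_vram_total")
    while True:
        used = read_mib("mem_info_vram_used")
        print(f"VRAM: {used:.0f} / {total:.0f} MiB")
        time.sleep(1)
except (FileNotFoundError, PermissionError):
    print("amdgpu sysfs VRAM counters not found on this system")
```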
 
Perhaps their 14nm isn't completely ready at this point.

TBH I wouldn't expect to see a dual 14nm card from AMD very soon regardless.

I do agree that this would have been much better off released earlier.
 
Whatever Polaris is ready, it's not high-end Polaris. And if it is, that's not what AMD will be launching this summer. So far they have hinted that it would be done by the back-to-school season (so end of summer or something) and have also hinted that they will be making affordable VR cards. That could mean anything, but IMO it means mid-range Polaris competing with the 970 and using hardly any power.

As such, two Fury X cores will absolutely batter that.

I bought my first Fury X early (about 8 days after Greg bought it lol) and the second one quite late (about a month ago). I was thinking about hanging on for Polaris, but it could have been a year away and that's too long.

What they release and when they release it is firmly in the hands of Nvidia. If Nvidia are ready first and beat them again, AMD will have to rush (like they did with the Fury X). However, if AMD are ready first they will most certainly release first, but will do it as Nvidia have (crap first, decent stuff later). That's how they can make the most money anyway.

Good luck to 'em. I really hope they catch Nvidia off guard and have a release to themselves. They didn't exploit us or take the pee with the 5000 series, so I have at least a little more faith in AMD than Nvidia.
 
Seeing as it was only offering 950-level performance with lower power consumption, it would without a doubt be the low end.
 
How did you come to that conclusion?

With Vsync on, locked to 60Hz, it did the same thing as the 950 whilst using less power.

That doesn't mean it's the same speed as a 950, dude, far from it ;)
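
Toy numbers to illustrate the point (the per-frame render times are invented purely for illustration): with a 60Hz cap both cards deliver exactly 60 FPS, but the faster one finishes each frame early and sits idle, which shows up as lower power draw rather than a higher framerate.

```python
# Why equal capped framerates don't imply equal speed.  The per-frame
# render times below are invented purely for illustration.

CAP_HZ = 60
frame_budget_ms = 1000 / CAP_HZ    # ~16.7 ms available per frame

cards = {
    "hypothetical slower card": 15.0,  # ms to render one frame
    "hypothetical faster card": 7.0,   # ms to render one frame
}

for name, render_ms in cards.items():
    busy = render_ms / frame_budget_ms
    uncapped_fps = 1000 / render_ms
    print(f"{name}: {busy:.0%} busy under the 60 Hz cap, "
          f"~{uncapped_fps:.0f} FPS uncapped")

# Both show a rock-solid 60 FPS with Vsync on; the faster card just
# spends more of each frame idle, hence the lower power draw.
```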
 
Didn't realize the 950 was tested in this thread. I was referring to a different leak I read.
 